That’s an interesting thought, but remember that it counts correctly 99% of the time. It only breaks, and rarely, on feeds with many subscribers. Those feeds fetch every 5 minutes and have dozens of new stories in a day.
My guess is that the unread counter routine is being killed for taking too long and may stop in the middle of a recount. But it shouldn’t blow away your read stories.
I just fixed the other issue that’s been new since July 1st, so this is the last issue to fix. I’m thinking about this constantly and may try to implement some additional checks next week.
Looks like the Verge just did it within the past 2 hours or so. I’ve got 500 unread right now after marking everything read just a few hours ago.
Yup, 500 unread on The Verge… And I had some items intentionally left unread; I’ll never find those again…
Happened to TechCrunch
Ok, I’m going to try something new. For your two accounts, @leonick and @boyblunder, provided your NewsBlur usernames are the same, I’m going to monitor your subscription to The Verge (NewsBlur) and watch how the unread counts change over time and what happens to the mark-as-read date.
So for the next week, I want both of you to remember when you last hit mark feed as read (probably not at all, considering you both keep unreads, and that’s great; I just want to know whether you have or haven’t hit it this week). And when the count next resets, let me know ASAP. This will help us get to the bottom of this.
Ok, neither of you subscribe to this The Verge feed: NewsBlur. Which one do you subscribe to?
https://newsblur.com/site/6643112/the-verge-all-posts for me.
Happened to BoingBoing again overnight on the East coast. Definitely after 8 PM and before 8 AM today. I don’t keep unreads so if you want to watch mine as well that’s fine.
I’m also on that one.
Won’t hit mark all as read; as you said, I’m unlikely to anyway since I save some items. It might take more than a week for an unread flip, but I’ll post here if/when it happens.
I got a big unread flip for 9to5Mac.
I also follow 9to5Mac but did not get an unread flip (at least not yet).
It seems like it’s happening based on which feed URL you’re following?
The Verge, ‘sadly’, but https://newsblur.com/site/7808652/macstories has flipped everything in the last 30 days to unread…
Happened to TechCrunch, sometime overnight.
When this happens, does it flip every story from the last 30 days back to unread? And do you all have premium accounts?
Hold on, I just noticed something. When you read these feeds as single feeds and not in a folder, does the feed ever show up blank, and no other feeds do except for these problematic ones? Blank meaning you load it and it has no stories, and after 5-10 seconds of loading the feed turns up stories again?
Yep, I think I just figured out what happened. The redis sync process is no longer atomic, and during one of the many fetches your client may be performing an unread story count; with 0 stories temporarily in the db, everything gets rolled back.
I’m going to work on finding a solution, but I’m positive this is the culprit.
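To illustrate the race described above, here is a minimal sketch. A plain dict stands in for the Redis keyspace, and all names (`sync_nonatomic`, `sync_atomic`, `count_unread`) are illustrative only, not NewsBlur’s actual code. The broken pattern deletes the story set and then repopulates it, so a concurrent unread count can observe an empty key; the fixed pattern builds the new set under a temporary key and swaps it in with a single rename-like step (in real Redis, RENAME is atomic).

```python
store = {}  # stands in for the Redis keyspace (hypothetical)

def sync_nonatomic(feed_key, stories):
    # Broken pattern: delete, then repopulate. Between the two steps the
    # key is empty, so a concurrent unread count sees 0 stories and the
    # client "rolls back" the user's unreads.
    store.pop(feed_key, None)
    # <-- a reader running at this point sees no stories at all
    store[feed_key] = set(stories)

def sync_atomic(feed_key, stories):
    # Fixed pattern: build the new set under a temp key, then swap it in
    # with one atomic replacement (Redis RENAME). A reader always sees
    # either the old set or the new one, never an empty keyspace.
    tmp_key = feed_key + ":tmp"
    store[tmp_key] = set(stories)
    store[feed_key] = store.pop(tmp_key)

def count_unread(feed_key, read_ids):
    # Unread = stories in the feed minus stories the user has read.
    return len(store.get(feed_key, set()) - read_ids)
```

With the atomic swap, a count taken at any point during a sync returns a sane number; with the non-atomic version, a count taken between the delete and the repopulate returns 0, which matches the blank-feed symptom above.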
And that will do it. A fix has been deployed!
Let me know if you see this happen again. I’m nearly certain this will fix the issue.
Good to hear you’ve probably found the issue. Will let you know if it happens again.
Just in case, for your questions:
As I recall, The Verge typically shows 500 unread items, so it might be hitting some other limit; otherwise, yeah, 30 days.
While I’ve certainly had feeds not load any items for several seconds now and then, that has definitely happened in more feeds than have had these unread flips.
What I’m finding today is that if I don’t read everything in a category (read, say, two things of five), click to a different category, and come back, the posts I read, which were marked read before I clicked away, are now unread again, though the counter on the left is correct.
In case the universe is just waiting for someone to jinx things… going a week without seeing this occur has me feeling confident that NewsBlur is now 100% bug free.
Seriously though, this issue seems to be resolved. Thanks for diligently working on and addressing this, @samuelclay!