Items not staying marked as read

This may be a problem related to too many new users, etc, etc, but some of my feeds that are more active than others have had their items marked as unread many times. For example, I’m currently looking at a feed where the oldest “new” item is from yesterday at 4:39am. I’ve marked this feed as read MANY times since then, but the items keep coming back!


Same thing happens here. Regardless of whether I read to the end of a feed, mark the feed as read, or mark a whole folder of feeds as read, some feeds come back with the same unread count as before I marked them read.

Yeah, this seems to happen. It was really bad during The Announcement, but I still get some repeated unread items at the oldest end of my feed.

Is this just related to the scaling issue Newsblur is having? Because it’s a pretty severe problem – I’ll see the same story presented as new eight or nine times.

Thanks Cody, I’ll try that. And I agree, I think he’s doing an excellent job considering what he’s fighting against – which is why I’ll give him both the benefit of the doubt and my money!

I just wanted to make sure that the issue we were talking about was a result of the recent server load problems, and not a “feature” -

Same problem here, seems to happen a lot on xkcd and Explosm:
http://www.xkcd.com/rss.xml
http://feeds.feedburner.com/Explosm

Both sites showed me the same articles over and over again, especially on the Android app.

Happened to me as well…

And sometimes it shows those read items again… even though I’ve selected unread only and they display as read… yet they still show up.

Perhaps it is because of an item too far back to display, which nonetheless keeps the feed “current”?

Happening more. “the official google blog” is the latest. Is it more common with the most popular blogs, perhaps?

The reason this is happening is that the mark-as-read request times out. Do me a favor and tell me the status code you get when this happens. If it’s a 502, then it’s timing out. If it’s a 500, I can look it up.

Just tried, all the /mark_story_as_read posts were 200 codes but it pops up the next time. Here’s an example of the post and misc info you might enjoy:

story_id=tag%3Ablogger.com%2C1999%3Ablog-10861780.post-3498969730356539289&feed_id=766

X-server:app23

Happens to me as well. I just upgraded to premium to get the continuous stream. Eventually it will say NewsBlur is down right now, try again later, and when I refresh I’m back to where I started, or maybe a few stories fewer. I’ve tried marking as read both manually and automatically, but same problem.

Great site. Had no idea it was a “labor of love” since it looks so professional, etc. Hope things will get better once the number of new users levels off…

Also just noticed if I purposely stop, quit my browser and relaunch, it doesn’t keep the items marked as read. Gonna go back to google reader and try again in a few days.

Someone on app.net told me that Dev was just a different skin on the front end and that using dev made no difference to speed or other things. I doubt that, myself, but he seemed sure.

Is it a different server with different functionalities?

Hiya,

I’ve had a quick look, and I’m getting 200s back from mark_story_as_read, but with response times of 8.03, 9.55, 6.21, 11.93 seconds, etc.

Response content seems fine too:
{“code”: 0, “authenticated”: true, “payload”: [“http://www.explosm.net/#20130312”], “result”: “ok”}

Interestingly, replaying the post also seems to return a 200… not sure if that’s meant to happen or not.

Hope this helps!
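Given the response body quoted above, a client-side success check could look at the fields in that sample. A sketch based only on the field names in the quoted response, not an official NewsBlur client:

```python
import json

def mark_succeeded(body):
    """Return True if a /mark_story_as_read response body signals success.

    Field names ("code", "result") are taken from the sample response
    quoted above; this is an assumption, not documented API behavior.
    """
    data = json.loads(body)
    return data.get("code") == 0 and data.get("result") == "ok"

# The sample response from earlier in the thread:
sample = ('{"code": 0, "authenticated": true, '
          '"payload": ["http://www.explosm.net/#20130312"], "result": "ok"}')
```

Which makes the mystery sharper: the server claims success, yet the stories later reappear as unread.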

The problem is that if you rush through your headlines while a “mark_story_as_read” request is taking too long, the front-end seems to forget to post the subsequent mark-as-read requests to the server.

Simple test: open up a feed with 10 unread stories, advance quickly with the next-unread button, and then count the POST requests. You will be missing a couple of them, especially if one of the first takes a very long time to finish.

So the stale unread count looks like a front-end problem that only occurs when the back-end is not fast enough to handle the requests.

I don’t think this is entirely accurate. I’ve seen all the requests go through (return successfully), yet if I come back to Newsblur 15 minutes later those stories will be marked as unread.

I’m wondering if it’s the /everything collection that is the problem.

It looks like there are differences: the normal site uses gunicorn, while the dev site uses nginx. Perhaps this whole thing could be sped up without the Simpsons quotes on every response.

For me what seems to happen is I eventually get a message saying “Newsblur is down, try again later” on the bottom pane. The last however many stories that were already loaded in the browser don’t get marked as read on the backend even though the unread count does decrease on the frontend.

I have the same problem on different feeds, so it is not the /everything collection. I think lockhead883 hits the nail on the head: the backend simply isn’t fast enough to handle all the requests, hence there should be a queue of some sort. Google must handle it somehow as well; I never had the problem there of read items popping back up later.
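The queue idea suggested above could look something like this: keep unacknowledged marks in a queue and retry them until the server confirms each one, instead of dropping them when a request is slow or fails. A minimal sketch (the `send` callable stands in for a hypothetical HTTP POST wrapper; names are mine, not NewsBlur’s):

```python
import collections

def drain_mark_queue(story_ids, send, max_attempts=5):
    """Retry each mark-as-read until `send` returns True (acknowledged),
    re-enqueueing failures up to `max_attempts` times per story."""
    queue = collections.deque(story_ids)
    attempts = collections.defaultdict(int)
    acked = []
    while queue:
        story = queue.popleft()
        attempts[story] += 1
        if send(story):
            acked.append(story)
        elif attempts[story] < max_attempts:
            queue.append(story)  # slow/failed request: try again later
    return acked
```

Even with a flaky backend where every third request times out, all marks eventually get through, which is exactly the property the thread is asking for.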

My experience (using LifeHacker as an example) is that when I have unread items in the feed, it will show me all items in the feed (read or unread). When I have no unread items, the feed does not show up on my unread feeds list.

However, when a new unread item arrives on LifeHacker, all the unread items come back. They are treated as read (rereading them does not decrease my unread count), and I don’t have to reread the read ones to clear out the unread ones.

NewsBlur only loads about 10 stories at a time when I look at a feed, so when I go to the LifeHacker feed it insists on loading the scores of already-read stories, in batches of 10, until it gets all the stories read. At which point the new, unread stories are at the bottom of the list. I haven’t tried Insta-Fetching to get around the batchiness. I’ll try that next time LifeHacker (or Ars, xkcd, etc) updates.

If I look at all my unread items, the old items gum up the works horribly. The working solution I have is to go specifically to the feeds with this problem, like LifeHacker, and clear them out first, then look at my unread items. Sometimes, of course, one of them updates while I’m trying to clear out the mess, and I’ve got to start all over again.

Like Bryn Hughes, the “load 10 stories at a time” bit sometimes times out, with a “Newsblur is down” message (this afternoon, of course, was “Newsblur is in maintenance mode”, but that’s a different issue). Once that happens, I have to start all over again.