This feed: http://www.vgcats.com/vgcats.rdf.xml just exploded and started spitting out dozens and dozens of the same item as though they were unread. I checked, and this isn’t occurring in Google Reader, so it probably isn’t a problem with the feed itself.
I don’t know if this is why it doesn’t work in NewsBlur, but their RSS feed is invalid XML. If you look at the document, they don’t close the <link> tags:

<link>http://www.vgcats.com/ [and then a little down...] <link>http://www.vgcats.com/comics/?strip_id=320
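You can see the problem with any strict XML parser. Here’s a minimal sketch using Python’s standard library, fed a small hypothetical fragment modeled on the feed (the titles and second URL are placeholders, not the feed’s actual content): the unclosed <link> elements make the document ill-formed, so parsing fails outright.

```python
import xml.etree.ElementTree as ET

# Hypothetical fragment modeled on vgcats.rdf.xml:
# the <link> elements are opened but never closed.
broken_rss = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>VG Cats</title>
    <link>http://www.vgcats.com/
    <item>
      <title>Strip 320</title>
      <link>http://www.vgcats.com/comics/?strip_id=320
    </item>
  </channel>
</rss>"""

try:
    ET.fromstring(broken_rss)
    print("parsed OK")
except ET.ParseError as e:
    # The first </item> doesn't match the still-open <link>,
    # so a strict parser rejects the whole document.
    print("invalid XML:", e)
```

A validator like the W3C feed validator will flag the same mismatched-tag errors.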
Submitted the same issue last night.
And though their RSS feed appears to be on the wonk, NB shouldn’t behave this badly in response.
It’s their feed. Oddly, what you’re seeing is one new story for every feed fetch that NewsBlur does. Their feed is broken and they need to repair it. But in rare circumstances like this one, the breakage doesn’t trip the somewhat lenient feed parser on NewsBlur’s end. Instead, all of the stories get scrunched down into one story, but it’s a very broken story. I wish this feed went to an exception, but it’s behaving just oddly enough that one story can at least be parsed out of it.
The real reason is that there’s no closing tag, so only the first story is getting captured. And because the feed is just weird enough, a new single story is added on every fetch. They really should fix it.
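For contrast, here’s the same hypothetical fragment with the <link> tags properly closed (again with placeholder titles and a made-up second item). Once the tags are balanced, a parser separates the items cleanly instead of collapsing everything into one mangled story:

```python
import xml.etree.ElementTree as ET

# The same hypothetical feed, repaired: every <link> now has a </link>.
fixed_rss = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>VG Cats</title>
    <link>http://www.vgcats.com/</link>
    <item>
      <title>Strip 320</title>
      <link>http://www.vgcats.com/comics/?strip_id=320</link>
    </item>
    <item>
      <title>Another strip</title>
      <link>http://www.vgcats.com/comics/</link>
    </item>
  </channel>
</rss>"""

root = ET.fromstring(fixed_rss)
items = root.findall("./channel/item")
print(len(items))  # each <item> is now a distinct story
```

That’s the whole fix on their end: close the tags, and every item becomes its own story again.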
Got it. Lazy old Scott Ramsoomair, hardly ever updating and using a broken feed…