DealExtreme's feed is *still* broken and needs urgent action!

DX.com's new arrival feed (http://dx.com/feed-new-arrivals) is seriously broken.
Most of the time it does not update properly: new items are in the feed XML, but nothing shows up in NewsBlur. You have previously blamed timeouts, and there are indeed plenty of timeouts in the statistics (but only for feed fetches, not page fetches; what is the difference between them?).
However, it has never taken me more than 4-6 seconds to fetch the entire file (usually about 1-1.5 MB) with wget or any browser I have tried.
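For anyone who wants to reproduce that measurement, here is a minimal sketch in Python (the URL is the one from this post; the 20-second limit mirrors the timeout mentioned in Samuel's answer below) that times a full download of the feed:

```python
import time
import requests  # third-party: pip install requests

FEED_URL = "http://dx.com/feed-new-arrivals"
TIMEOUT = 20  # seconds; the limit NewsBlur reportedly applies

start = time.monotonic()
response = requests.get(FEED_URL, timeout=TIMEOUT)
elapsed = time.monotonic() - start

# Note: requests' timeout applies per socket operation (connect/read),
# not to total wall-clock time, so we measure elapsed time ourselves.
print(f"HTTP {response.status_code}: {len(response.content)} bytes in {elapsed:.1f} s")
```

If this consistently finishes in a few seconds, the download itself is unlikely to be what hits a 20-second limit.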
But the problems don't stop at the timeouts. When the feed once in a blue moon *is* fetched (most recently, as of now, 2013-07-02 01:27:33), NewsBlur always stops at 50 posts, while today, for example, the feed contains 300+ items. (It is usually 300-500 new items, updated several times a week.)
Until now I have migrated everything besides my dx.com feed to NewsBlur, but I have still been forced to use Reader to get this specific feed to work. And now, after the Reader sunset, I use Feedly for this feed, since it has no issues with it at all.
It would be nice to use the reader I pay for and actually prefer, if only it worked.
There are 311 of us subscribed to this feed, and we deserve something better than your response to the issue six months ago: "This is an unfortunate feed."


Umm, running the feed through a validator, it looks like the issue is at the DX end, as the feed isn't valid:

http://validator.w3.org/feed/check.cg…

Sure, it has a few minor errors (an invalid length attribute and a malformed URL on the enclosure image).
However, all other readers I have tried have no problems coping with these issues.
And since the problem has been constant for six months, with NewsBlur still managing to parse the feed a few times a week, these errors are most likely not the cause.
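As a quick sanity check of that claim, a lenient parser such as Python's feedparser (a local test sketch, not what any particular reader runs in production) will recover the entries despite the validity errors and lets you count them:

```python
import feedparser  # third-party: pip install feedparser

FEED_URL = "http://dx.com/feed-new-arrivals"

parsed = feedparser.parse(FEED_URL)

# feedparser sets the "bozo" flag when the feed is not well-formed XML,
# but it still returns whatever entries it managed to recover.
if parsed.bozo:
    print(f"Feed has issues: {parsed.bozo_exception}")

print(f"Recovered {len(parsed.entries)} entries")
for entry in parsed.entries[:5]:
    print("-", entry.get("title", "(no title)"))
```

If this prints ~300 entries, the minor errors are clearly survivable for a lenient parser, which supports the point that they are not what caps NewsBlur at 50.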

Today the feed was once again parsed.
But as usual, only 50 of the 302 items are available.
319 subscribers to a broken feed…

For the past few days I've been testing NewsBlur, Feedly, and g2reader side by side, including the DealExtreme feed (among other things) and this 50-item cap in NewsBlur.

NewsBlur: the DX feed returns only 50 items (while today there are about 300 new items in the actual RSS feed).

g2reader: seems to fetch all ~300 items, but all of them are automatically marked as "read", because the site puts the feed into "passive mode" when it contains hundreds of new items every day.

Feedly: all ~300 items are fetched AND all are marked as unread!!!

So long NewsBlur, for now… Feedly is a great service (it doesn't have many of the other restrictions that NewsBlur has). It's not perfect and has some minor things to fix and develop, but it really seems to be the only choice to replace Google Reader (and I've tested over 10 alternatives).

Good luck NewsBlur, keep developing your service; currently it's "number 2" on my list, and I might check back someday. Thanks for the hard work.

Samuel, can you please respond to this or to one of the other two tickets reporting that the DX.com feed only receives 50 items each time it is updated with 300+ items?
As a paying member, I must say that the lack of response on this is really bad.

Feedly is the biggest, and it's backed by venture capital. Hopefully someday they'll figure out how to monetize that. I'm sure they've got it figured out better than Google did. Right?

NewsBlur is one guy. Talented, sure, but he has the same 24 hours in a day that we all have. Think of the influx of users over the last couple of weeks, plus the uptick in bug reports across all the different platforms he has to support (not to mention the API for developers!). I propose we cut him a bit of slack.

And in the meantime, if it really bugs you, ask him for a refund and switch to Feedly. I mean, they finally added a copy-and-paste-into-a-file OPML export! They've got everything!

I responded in the other thread: https://getsatisfaction.com/newsblur/…

Here’s my answer:

This is an unfortunate feed. The reason it's doing this is that it's timing out. I have a 20-second timer, and this feed hits it nearly every time.

Also, I noticed that the RSS feed has 300 items in it, which is why it's timing out: parsing that many stories takes more than 20 seconds. I'll look into how to better handle timeouts when the data is all there.
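One common way to decouple those two costs looks roughly like this (a minimal sketch under my own assumptions, not NewsBlur's actual code; the URL and the 20-second figure come from this thread): apply the timeout to the network fetch only, then parse the already-downloaded bytes without a deadline.

```python
import requests    # third-party: pip install requests
import feedparser  # third-party: pip install feedparser

FEED_URL = "http://dx.com/feed-new-arrivals"

# The 20-second limit applies to the download only. Once the bytes
# are in memory, parsing 300+ items can take as long as it needs
# without being counted against the fetch timeout.
response = requests.get(FEED_URL, timeout=20)
response.raise_for_status()

parsed = feedparser.parse(response.content)  # parse offline, no deadline
print(f"Parsed {len(parsed.entries)} entries")
```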

Knic Pfost seriously needs to think about what his post actually contributed to this topic. Fanboyism and slander.