Automated Workaround for Feeds Not Updating

Just to give this some additional visibility, and to collect updates while we wait for the feeds-not-updating issue to be fixed:

Here is a workaround that worked for me:

  1. Open DevTools in your browser (usually F12, Ctrl+Shift+I, or Tools -> Developer Tools)
  2. Paste the following into the Console
    var fn = function (i) { if (i >= $('.feed[data-id]').length) return; $('.feed[data-id]:eq(' + i + ')').click(); setTimeout(function () { fn(i + 1); }, 1000) }
    and press enter
  3. Type fn(0) and press enter
  4. Wait

That script goes through each feed and clicks it, which at least automates the process of clicking into the feeds and re-fetching the stories. I don’t know if the feeds will auto-update after that, but it’s faster than doing it manually.

Bookmarklet
Get Satisfaction won’t let me post the actual link since it contains a script, so go here: http://jsfiddle.net/u4b39nyu/ and drag the “Update NewsBlur” link to your bookmark bar.

You can then click it while on NewsBlur to update the feeds without having to open DevTools.
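
The bookmarklet is essentially the console snippet above wrapped in a javascript: URL. Something along these lines should behave the same, though this is only a sketch and the version in the fiddle may differ:

```
javascript:(function () { var fn = function (i) { if (i >= $('.feed[data-id]').length) return; $('.feed[data-id]:eq(' + i + ')').click(); setTimeout(function () { fn(i + 1); }, 1000); }; fn(0); })();
```

It relies on jQuery already being loaded, which is why it only does anything while you’re on NewsBlur.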

Andrew, do you have the URL that gets submitted for each feed? I can sniff it but I assume you’ve already done so.

The DevTools route didn’t work for me (Firefox), but the bookmarklet does (in combination with a page refresh). Awesome. I didn’t even consider you could do that… Heh

Did you try it with a shorter delay between clicks?

It’s driven by IDs from what I can see. The calls are GET requests to /reader/feed/:id

The ID is in the HTML element’s data-id attribute.

I expect it should be possible to fetch all of those IDs and make a GET request for each individual one, which might actually be faster if it does in fact trigger the refresh of that feed. I'll look into it.
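
Something like this in the console should do it, assuming the same jQuery selectors as the click script and that a plain GET to /reader/feed/:id is enough to trigger the fetch (a sketch, untested):

```
// Collect every feed ID from the sidebar (the data-id attributes), then GET
// /reader/feed/:id for each one, spaced out with the same setTimeout trick.
var ids = $('.feed[data-id]').map(function () { return $(this).data('id'); }).get();
var hit = function (i) {
  if (i >= ids.length) return;
  $.get('/reader/feed/' + ids[i]);
  setTimeout(function () { hit(i + 1); }, 1000);
};
hit(0);
```

If the GET does kick off a refetch, that skips clicking through the UI entirely.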

yeah, I’m going to try doing that with curl/python.

I have tried as low as 250 ms, which seems to work OK.

Great, a GET to each feed ID works. Proved it with a single-subscriber feed that always has new posts (Stellar). I’ll start looping through them. I don’t know how high the IDs go; I just tried one near the top of the 6-digit range.

If you export your feeds into XML and then reload them, it fixes it.

Ahh, you’re trying to run through everyone’s feeds to force a refresh. Nice idea! I thought they would be tied to user accounts, but it doesn’t look like that’s actually the case.

ah this is awesome. thanks.

yeah, it’s working but really slow. I’ll need to do it async/threaded to make it decently fast. I’ve only done 5000 in 20 minutes.
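
Roughly, the idea is to fire the GETs in fixed-size batches instead of one at a time. Here’s a sketch in JavaScript (Node 18+ has a built-in fetch), not the actual script; the base URL, ID range, and batch size are placeholders:

```
// Sketch: refresh feeds by GETting /reader/feed/:id in fixed-size batches.
// Requires Node 18+ for the built-in fetch.
const BASE = 'https://www.newsblur.com/reader/feed/'; // placeholder base URL
const BATCH = 50;                                     // requests in flight at once

async function refreshRange(start, end) {
  for (let id = start; id <= end; id += BATCH) {
    const batch = [];
    for (let i = id; i < Math.min(id + BATCH, end + 1); i++) {
      // Ignore individual failures so one bad feed doesn't stop the run.
      batch.push(fetch(BASE + i).catch(() => null));
    }
    await Promise.all(batch);
    console.log('done through feed id', Math.min(id + BATCH - 1, end));
  }
}

refreshRange(1, 100000).catch(console.error);
```

The same shape works in Python with a thread pool or asyncio; the batch size is the main knob for how hard you hit the server.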

Sweet! Thanks!

I tried the script, and it went through all the feeds (including saved ones) but then hid them all so I still couldn’t see them. I’ll try the XML thing next.

Works like a charm. Thanks!

here’s my script. it’s really fast now, already through the first 20k feeds. pull requests welcome.
https://github.com/tedder/newsblur-hy…

Ahah! Wait a minute… now they’re showing! It only hid them for a few minutes. Thanks!

Nice! I put one together in C# as well: https://gist.github.com/andrewburgess…

There are at least 1.6MM feeds (not 100k), btw. Also, loving the spike on the NewsBlur “sites loaded” stat 🙂

Hahaha definitely. How about I start at the end and you keep going at the beginning. We’ll get everyone’s feeds updated then!

sure - start at 1MM perhaps? are you on Twitter? we can post status there without spamming people so terribly.