The Hacker News 50 feed ( http://feeds.feedburner.com/hacker-ne… ) pushes a huge number of stories into NewsBlur, none of them properly linked or with content. The feed parser seems to have a problem there.
For me the links to the _articles_ are working, but the links to the _comments_ are broken (I’m getting links like: https://news.ycombinator.com///news.y… )
Comments links are broken for me too. Not sure if this is the parsing code or a change in HN’s RSS feed.
HN’s feed changed. I verified this today. Unfortunately, there’s not much I can do. They are now publishing base URLs and full URLs with an unusual encoding, so the links are getting clobbered: the encoding effectively hides the fact that each link is already a fully qualified URL with its own domain name (FQDN).
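A minimal sketch of the kind of clobbering described above, assuming the parser resolves links against the feed’s base URL before HTML-unescaping them (the specific URLs and item ID are illustrative, not taken from the actual feed):

```python
import html
from urllib.parse import urljoin

# HN's feed escaped slashes as the HTML entity &#x2F;, so a fully
# qualified link looks like a relative path to a URL parser.
base = "http://news.ycombinator.com/"
raw_link = "http:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=12345"

# Wrong order: resolve against the base URL first, unescape second.
# urljoin sees no "//" after the scheme, treats the rest as a relative
# path, and glues it onto the base host.
clobbered = html.unescape(urljoin(base, raw_link))
print(clobbered)
# -> http://news.ycombinator.com///news.ycombinator.com/item?id=12345

# Right order: unescape the entities first, then resolve. The link is
# now recognizably absolute and urljoin returns it unchanged.
fixed = urljoin(base, html.unescape(raw_link))
print(fixed)
# -> http://news.ycombinator.com/item?id=12345
```

Note how the wrong-order result matches the broken `https://news.ycombinator.com///news.y…` links reported above.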
I just checked Google Reader. Not sure how, but Reader seems to parse it alright, so there might be something you can do (or maybe not; I’m too ignorant of the NewsBlur codebase to make any judgement). I can try to contact the HN admin and convince them to update their feed.
It is still a valid RSS feed according to http://feedvalidator.org/check.cgi?ur…
Surely there is a way to parse URLs that use the HTML entity &#x2F; to encode the forward slash?
Samuel, I’m piggybacking on this thread because I noticed something in an established, previously stable feed (Hacker News 50) and wonder whether it could be the same problem. The symptom for me: I get titles in the feed but no content, and when I expand to full page, it simply redraws the NewsBlur page for the group I was looking at. I didn’t notice whether the item is marked as ‘read’ before the page is redrawn, but if I can give you more detail, write me offline. Thanks. jmb
Good news: this has finally been fixed. It took a lot of back-and-forth with the developer behind Hacker News, as well as some work on the feedparser library. All good now.
Awesome, thanks very much!
I’m sorry to report that it looks like this issue has returned. I’m getting the same problem, with comments links pointing to:
They’re on it. The developer says “next reboot,” but who knows when that is.