Benefits of owning a self-hosted NewsBlur?

and here are the haproxy logs now:

[NOTICE]   (1) : path to executable is /usr/local/sbin/haproxy
[WARNING]  (1) : config : parsing [/usr/local/etc/haproxy/haproxy.cfg:49] : a 'monitor fail' rule placed after a 'redirect' rule will still be processed before.
[WARNING]  (1) : config : parsing [/usr/local/etc/haproxy/haproxy.cfg:50] : a 'monitor fail' rule placed after a 'redirect' rule will still be processed before.
[WARNING]  (1) : config : backend 'node_socket' uses http-check rules without 'option httpchk', so the rules are ignored.
[WARNING]  (1) : config : backend 'node_favicon' uses http-check rules without 'option httpchk', so the rules are ignored.
[WARNING]  (1) : config : backend 'nginx' uses http-check rules without 'option httpchk', so the rules are ignored.
[NOTICE]   (1) : New worker (8) forked
[NOTICE]   (1) : Loading success.
[WARNING]  (8) : Health check for server node_socket/node_socket failed, reason: Layer4 connection problem, info: "Connection refused", check duration: 3ms, status: 0/2 DOWN.
[WARNING]  (8) : Server node_socket/node_socket is DOWN. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
[ALERT]    (8) : backend 'node_socket' has no server available!
[WARNING]  (8) : Health check for server node_favicon/node_favicon failed, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms, status: 0/2 DOWN.
[WARNING]  (8) : Server node_favicon/node_favicon is DOWN. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
[ALERT]    (8) : backend 'node_favicon' has no server available!
[WARNING]  (8) : Health check for server node_text/node_text failed, reason: Layer4 connection problem, info: "Connection refused", check duration: 2ms, status: 0/2 DOWN.
[WARNING]  (8) : Server node_text/node_text is DOWN. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
[ALERT]    (8) : backend 'node_text' has no server available!
[WARNING]  (8) : Health check for server node_page/node_page failed, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms, status: 0/2 DOWN.
[WARNING]  (8) : Server node_page/node_page is DOWN. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
[ALERT]    (8) : backend 'node_page' has no server available!
[WARNING]  (8) : Health check for server postgres/db_postgres succeeded, reason: Layer4 check passed, check duration: 1ms, status: 3/3 UP.
[WARNING]  (8) : Health check for server mongo/db_mongo succeeded, reason: Layer4 check passed, check duration: 0ms, status: 3/3 UP.
[WARNING]  (8) : Health check for server gunicorn/app_django failed, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms, status: 0/2 DOWN.
[WARNING]  (8) : Health check for server redis/db_redis succeeded, reason: Layer4 check passed, check duration: 1ms, status: 3/3 UP.
[WARNING]  (8) : Server gunicorn/app_django is DOWN. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
[ALERT]    (8) : backend 'gunicorn' has no server available!
[WARNING]  (8) : Health check for server redis/db_redis_pubsub succeeded, reason: Layer4 check passed, check duration: 1ms, status: 3/3 UP.
[WARNING]  (8) : Health check for server redis_story/db_redis_story succeeded, reason: Layer4 check passed, check duration: 0ms, status: 3/3 UP.
[WARNING]  (8) : Health check for server redis_sessions/db_redis_sessions succeeded, reason: Layer4 check passed, check duration: 0ms, status: 3/3 UP.
[WARNING]  (8) : Health check for server elasticsearch/db_elasticsearch failed, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms, status: 0/2 DOWN.
[WARNING]  (8) : Server elasticsearch/db_elasticsearch is DOWN. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
[ALERT]    (8) : backend 'elasticsearch' has no server available!
[WARNING]  (8) : Health check for server node_images/node_images succeeded, reason: Layer7 check passed, code: 200, check duration: 3318ms, status: 3/3 UP.
[WARNING]  (8) : Health check for server node_socket/node_socket succeeded, reason: Layer4 check passed, check duration: 0ms, status: 1/2 DOWN.
[WARNING]  (8) : Health check for server nginx/nginx succeeded, reason: Layer4 check passed, check duration: 1ms, status: 3/3 UP.
[WARNING]  (8) : Health check for server node_favicon/node_favicon succeeded, reason: Layer4 check passed, check duration: 0ms, status: 1/2 DOWN.
[WARNING]  (8) : Health check for server node_text/node_text succeeded, reason: Layer7 check passed, code: 200, check duration: 368ms, status: 1/2 DOWN.
[WARNING]  (8) : Health check for server node_page/node_page succeeded, reason: Layer7 check passed, code: 200, check duration: 163ms, status: 1/2 DOWN.
[WARNING]  (8) : Health check for server node_socket/node_socket succeeded, reason: Layer4 check passed, check duration: 0ms, status: 3/3 UP.
[WARNING]  (8) : Server node_socket/node_socket is UP. 1 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
[WARNING]  (8) : Health check for server node_favicon/node_favicon succeeded, reason: Layer4 check passed, check duration: 0ms, status: 3/3 UP.
[WARNING]  (8) : Server node_favicon/node_favicon is UP. 1 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
[WARNING]  (8) : Health check for server node_text/node_text succeeded, reason: Layer7 check passed, code: 200, check duration: 5ms, status: 3/3 UP.
[WARNING]  (8) : Server node_text/node_text is UP. 1 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
[WARNING]  (8) : Health check for server node_page/node_page succeeded, reason: Layer7 check passed, code: 200, check duration: 6ms, status: 3/3 UP.
[WARNING]  (8) : Server node_page/node_page is UP. 1 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
[WARNING]  (8) : Health check for server gunicorn/app_django succeeded, reason: Layer4 check passed, check duration: 0ms, status: 1/2 DOWN.
[WARNING]  (8) : Health check for server gunicorn/app_django succeeded, reason: Layer4 check passed, check duration: 1ms, status: 3/3 UP.
[WARNING]  (8) : Server gunicorn/app_django is UP. 1 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
[WARNING]  (8) : Health check for server elasticsearch/db_elasticsearch succeeded, reason: Layer4 check passed, check duration: 1ms, status: 1/2 DOWN.
[WARNING]  (8) : Health check for server elasticsearch/db_elasticsearch succeeded, reason: Layer4 check passed, check duration: 0ms, status: 3/3 UP.
[WARNING]  (8) : Server elasticsearch/db_elasticsearch is UP. 1 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
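Side note on the three "uses http-check rules without 'option httpchk', so the rules are ignored" warnings near the top: they are benign (the backends fall back to Layer4 checks, which is what the log shows succeeding), but they can be silenced by declaring `option httpchk` in each affected backend before its `http-check` rules. A minimal sketch, where only the backend name comes from the log — the request path, hostname, and port are assumptions and need to match your actual service:

```
backend node_favicon
    option httpchk GET /
    http-check expect status 200
    server node_favicon node_favicon:8008 check
```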

That first screenshot is showing a broken NewsBlur missing a CSS file. What do you see in the web inspector when you look for broken links or CSS? Is it a 404 to a CSS file? What is that URL? Do you have DEBUG_ASSETS turned on or off in docker_local_settings.py or your own local_settings?

It is a 403, and the URL is https://cloud.typography.com/6565292/711824/css/fonts.css

DEBUG_ASSETS is set to True in docker_local_settings.py
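For reference, these toggles are plain Python assignments in docker_local_settings.py. A hypothetical excerpt — only DEBUG_ASSETS is confirmed in this thread, and the comments are my reading of what the flags do:

```python
# docker_local_settings.py (excerpt; layout and comments are assumptions)
DEBUG = True         # show full Django tracebacks in the browser
DEBUG_ASSETS = True  # serve the original CSS/JS files instead of compiled bundles
```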

Then everything else is working fine. Yeah, I haven’t bothered to fix the fonts thing on my own installation either. It’s a bit hard with localhost setups and not wanting to distribute unlimited font keys.

So I’m not sure what’s broken now. Feel free to poke around the logs of the different docker containers and restart things to see if anything changes. I’ll need a bit more to go on, since it looks like everything is humming along.

It was the most difficult Docker project I’ve encountered, I must say…

I found I can actually sign up (I found the DB entries) and log in. But after logging in, the page was still redirected as shown, with the login returning a 302 and a broken image (https://localhost/media/img/reader/default_profile_photo.png)

Will fixing these two things get me a working instance? I checked all the docker logs and everything runs fine…

And the left pane is gone after I try to log in at /account/login. It sounds like some UI elements are not being loaded somehow. Is it the newsblur_web container that controls the UI?

Did you clone the GitHub repo and then run make nb to get it working, or is there a way to configure it in a docker-compose.yml and just bring things up like most regular Docker containers? I have an existing nginx proxy manager setup I was hoping to integrate this into, to at least try it out extensively for six months. (The free version doesn’t support enough feeds for me to just load the OPML in from Inoreader, with 338 feeds and counting, and I’m paid up till 2024 over there, so I’m not exactly averse to paying for this service, but I don’t want to start paying for something I’m already paying for elsewhere when I haven’t determined whether it’s a big improvement or not. The two projects are strikingly different in what seem like opinionated ways, but this is the only other one I’ve looked at so far that is interesting enough to put the effort into trying out, and maybe start paying for if self-hosting goes well.)

I followed the GitHub instructions, which are to clone and make nb, with some tweaks in some of the Docker files to suit my environment. It is hard to make it work, at least for now. I may come back and play with it again later when I have time.

Did you have to edit the Makefile to change some “docker compose” entries to “docker-compose” to get through the install? That was extremely weird to me, because there are some correct docker-compose entries in there as well; no idea how that could happen in the code other than maliciously. My instance is now broken at the same place yours appears to be (weird website, unable to sign up). I spun up a new LXC to install it in, so I didn’t mess with anything other than the very obvious docker compose errors in the Makefile.

Yes, I did the replacement. I do think this is the reason my installation did not work while others’ did.

Are you running an older version of docker? I had to change all the docker-compose to docker compose because it would complain otherwise.

Docker version 20.10.12 currently, in openmediavault 5
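For anyone else hitting this: the edit described above boils down to a single substitution over the Makefile. A sketch of the older-Docker direction, demonstrated on a throwaway sample file rather than the real Makefile:

```shell
# On Docker engines without the compose v2 plugin (e.g. 20.10.x),
# "docker compose" is not a valid subcommand, so rewrite those calls
# to the standalone "docker-compose" binary.
printf 'nb:\n\tdocker compose up -d\n' > Makefile.sample
sed -i 's/docker compose/docker-compose/g' Makefile.sample
cat Makefile.sample
```

Users on newer Docker with the compose plugin would run the substitution in the opposite direction.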

Let’s move the discussion to a new ticket. If this ticket doesn’t describe the problem, then please make a new one, either here or on github:

Thanks Samuel for the follow-up. I could finally host my own instance after doing make collectstatic :+1:t2:

For all arm users’ reference, I am using yusukeito’s imageproxy

Great idea on using yusukeito/imageproxy:v0.11.2. I added that to docker-compose, so the next time you git pull && make nb you’ll have a full arm64 setup.
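For anyone else on arm64, the swap is essentially just the image: line of the imageproxy service in docker-compose.yml. A hypothetical excerpt — only the image tag comes from this thread; the service name, restart policy, and port mapping are assumptions to check against the repo’s actual compose file:

```yaml
  imageproxy:
    image: yusukeito/imageproxy:v0.11.2   # arm64-compatible build
    restart: unless-stopped
    ports:
      - "8088:8080"   # host:container, assumed
```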


But now I’ve run into another situation: a pop-up window shows an error when I click the Find Twitter Friends button. I have already filled in the Twitter token in docker_local_settings.py.
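For context, the Twitter credentials are also plain assignments in docker_local_settings.py. A hypothetical sketch — the variable names are my assumption, so check NewsBlur’s settings.py for the exact ones, and the values are placeholders:

```python
# docker_local_settings.py (excerpt; names assumed, values are placeholders)
TWITTER_CONSUMER_KEY = "your-api-key"
TWITTER_CONSUMER_SECRET = "your-api-secret"
```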

What do you see when you type make log?

Also, that implies settings.DEBUG is set to False, but it’s set to True in docker_local_settings.py. Why is your DEBUG set to False?

I just set it to True, and now I know that I got a 401.

I then corrected the Key and Secret. With OAuth 2.0 I successfully load the auth page, but after the redirect this message is returned:

Gotcha, from make log I learned that I need to apply for Elevated access in order to make it work. Just applied, and now it is connected :ghost: