RSS feeds - The WWW

xenone
Members
I've been seeing a lot of people talking recently about using RSS instead of social media sites. What are everybody's thoughts on this, and any suggestions on RSS readers for people moving in that direction?
venam
Administrators
My answer might be biased because I've stayed away from "popular" social media hubs for the past 9-10 years, but I don't think I regret it.

There's one thing I love about the web, and it's right there in the name: the web part, how you can jump from one place to another and discover things along the way. I wrote a bunch of posts in the now-discontinued newsletter about how the WWW is awesome, and also posted a discussion here on the forums about the state of the web.
Recently, I also saw multiple posts about putting more emphasis on RSS, blogging, and self-hosting; basically, trying to put some joy back into the web. We had previously tried to gather all nixers members' blog feeds without much luck, but we're trying again right now.

I think the thread joining all these topics is that people's web consumption has become repetitive and centralized, or to say it bluntly: they're bored. A few days ago a friend asked me about my web browsing habits, wondering how they could change theirs. They felt stuck in a loop going from link aggregation sites, to Facebook, to Reddit, to Twitter, and YouTube. There's much more to the web than a few websites. Maybe the fact that everyone is stuck at home has made some people realize this.
jkl
Long time nixers
RSS/Atom solves the problem of staying informed without having an unfiltered mess of random people’s thoughts land in your inbox. Nothing else provides a comparable alternative.

I wonder why certain social networks implement their own syndication logic for user posts. They can never beat the flexibility of customizable XML feeds.

Plus, there are services like my RSS parser and the RSS Bridge that provide feeds for anything that looks like a website. There is no sane reason to use more than your RSS reader to read, filter, and understand what is happening in the world. And feeds won’t disappear anytime soon, so you don’t have to frantically press F5 every day.

I admit that Google Reader made me an RSS user. Even years after its demise, I haven’t left.
ckester
Members
I've never been a fan of social media hubs or other centralized approaches. Even what you call an attempt to "gather all nixers member blog feeds" leaves me cold.

In my opinion, the best place to aggregate rss feeds is on my own local machine. For many years now I've been using Adam Sampson's "rawdog" aggregator to build a set of river-of-news style webpages for various categories and serve them up via localhost. (Lately I've been looking at sfeed as a C-based alternative to rawdog's Python, but it's an idle interest and I haven't been motivated to actually implement it yet. What I have works well enough.)

For discovering interesting new stuff, it suffices if the blogs I read include an old-fashioned blogroll. It isn't necessary for them to repeat or summarize content from the other blogs they recommend. I.e., I'm building my own river of news and don't need you to do it for me on your site. Just give me your content. *

Other leads come as a result of websearches via DuckDuckGo. If a search takes me to a site that looks interesting and I see it has a feed, into the rawdog config it goes.

Years ago I would occasionally use StumbleUpon to find new stuff. But that was a long time ago and I don't know if it even still exists.

* Here's an example of the kind of thing that often results when someone decides to "curate" the news for us: https://devurls.com. Very pretty, but mostly useless. Only one or two sites listed that I care about, and a lot of relatively dormant feeds.
vain
Long time nixers
Big fan of feeds.

Feeds are a generic format, so you don't need that one client ("app") for this site and another client for another site and so on. Pick your reader or write your own. Once. Then fill it with content to watch.

It's decentralized, you can "follow" pretty much anything. (Social media promoted "you can follow everyone on our social network!" as a unique feature, but it already existed ...)

I consume pretty much everything through feeds, including aggregators like reddit or lobsters. Why would I want to open their slow web site or even be forced to log in?

It's a shame, though, that major browsers removed virtually every feature related to feeds. There used to be a little icon indicating when a feed was available. Not even that exists anymore.

I, too, wonder which feed readers you guys use. There don't seem to be that many great tools available. Me, personally, I use a custom script based on feedparser (https://pypi.org/project/feedparser/), which grabs feeds and sends new items as an e-mail. But I have no idea how to "sell" the idea of feeds to "normal" people, because what are they supposed to use?
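For the curious, the core of such a script is small. Here's a stdlib-only sketch of the idea (my real script uses feedparser and handles Atom too; this sketch assumes RSS 2.0 element names, and all addresses are placeholders):

```python
import smtplib
import xml.etree.ElementTree as ET
from email.message import EmailMessage

def new_items(feed_xml, seen_ids):
    """Return (id, title, link) tuples for items not yet in seen_ids."""
    root = ET.fromstring(feed_xml)
    items = []
    for item in root.iter("item"):  # RSS 2.0; Atom uses <entry> instead
        guid = item.findtext("guid") or item.findtext("link")
        if guid and guid not in seen_ids:
            items.append((guid, item.findtext("title", ""),
                          item.findtext("link", "")))
    return items

def as_mail(title, link, sender, recipient):
    """Wrap one feed item in an e-mail message."""
    msg = EmailMessage()
    msg["Subject"] = title
    msg["From"] = sender
    msg["To"] = recipient
    msg.set_content(link)
    return msg

# Usage sketch: fetch each feed, mail unseen items, remember their ids.
# with smtplib.SMTP("localhost") as s:
#     for guid, title, link in new_items(xml_text, seen):
#         s.send_message(as_mail(title, link, "feeds@localhost", "me@localhost"))
#         seen.add(guid)
```

The only state to persist between runs is the set of seen ids; everything else is the mail system's problem.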
venam
Administrators
(21-06-2020, 03:16 PM)ckester Wrote: I've never been a fan of social media hubs or other centralized approaches. Even what you call an attempt to "gather all nixers member blog feeds" leaves me cold.

It isn't meant to be centralized in any way, just a way to discover other people's feeds. I won't personally use the list as is either; I'll subscribe to the RSS feeds of the ones that interest me. Also, the list isn't meant to give rise to a website. It's simply a text file, nothing more, nothing less.

(21-06-2020, 03:16 PM)ckester Wrote: For discovering interesting new stuff, it suffices if the blogs I read include an old-fashioned blogroll.
That's also what I do most of the time. Think of the above list of nixers blog feeds as a blogroll in its own right.
ckester
Members
(21-06-2020, 03:19 PM)vain Wrote: I, too, wonder which feed readers you guys use. There don't seem to be that many great tools available. Me, personally, I use a custom script based on feedparser (https://pypi.org/project/feedparser/), which grabs feeds and sends new items as an e-mail. But I have no idea how to "sell" the idea of feeds to "normal" people, because what are they supposed to use?

As I described above, I use a separate tool to aggregate the feeds I follow and a web browser to read the resulting "river of news". I looked at tools that combine the aggregating and reading functions (snownews, newsbeuter, canto, etc.) but found myself frequently needing to launch my web browser to follow a link anyway.

So I eventually decided that the reading function belongs in the web browser to begin with, because after all, what I'm reading is web content.

But keeping the aggregating function separate makes it easier to run via cron, among other things.
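For instance, a single crontab line covers the aggregating side (flags are what I recall from rawdog's usage; check rawdog's help output to confirm):

```shell
# Fetch feeds (-u, update) and regenerate the pages (-w, write)
# at the top of every hour; the browser then reads the generated
# river-of-news pages whenever convenient.
0 * * * *  rawdog -u -w
```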
jkl
Long time nixers
After a short period with Feedly and a few attempts to make Tiny Tiny RSS an acceptable alternative, I started to use Newsblur (via its own web client, via Fiery Feeds on iOS, and via elfeed inside Emacs) until I finally get my rssfs working on Windows. :)

Honestly, Newsblur, even in its paid hosted version, is really strong and it’s Open Source if you care. And I prefer to have my feeds synchronized between my computers all the time.
vain
Long time nixers
(21-06-2020, 03:48 PM)jkl Wrote: Honestly, Newsblur, even in its paid hosted version, is really strong and it’s Open Source if you care.

Hmm, yeah, SaaS for feeds. It’s tempting to recommend one of those. You know, something you could recommend to your aunt, something that’s easy to use and automatically “synchronizes” across devices.

I was naïvely hoping to get recommendations along the lines of Liferea – but, yes, SaaS is probably what normal users want. (Maybe even that is too much work. I have no idea. I live in a different world. :-))


(21-06-2020, 03:35 PM)ckester Wrote: So I eventually decided that the reading function belongs in the web browser to begin with, because after all, what I'm reading is web content.

I’m struggling with this. On one hand, I love to read the feeds as they are. But that only works for some of them, some don’t even include the full article. And as you pointed out, following links is virtually broken.

I also recently discovered that my own feeds are broken in many feed readers. The feeds themselves are fine and conform to the spec, but the readers don’t care about the specs. Bam, no images show up. How annoying.
ckester
Members
I'm not sure I understand what you mean by reading the feeds as they are, but rawdog doesn't do anything to the feed content except wrap it up in div tags for inclusion in a local webpage.

At one time I had made a modified version of it that stripped out all but the first few lines of each feed entry, but I lost that when I made a mistake backing up my files before a system wipe and fresh install a few years ago. So if a feed includes the full article, that's what I get in the resulting page. Or if the feed has only a summary or a title for each entry, that's all I get for that one.

For screenshots and more info re the unfortunately-named rawdog, see https://offog.org/code/rawdog/.

But I take jkl's point about synchronization. I don't do much mobile computing myself, so I get enough "synchronization" from making one of my home machines a webserver for those rawdog-generated pages. But I can see why some of you might prefer a cloud-based solution.
vain
Long time nixers
(23-06-2020, 02:23 PM)ckester Wrote: rawdog doesn't do anything to the feed content except wrap it up in div tags for inclusion in a local webpage.

Oh, I see. I misunderstood then. I was assuming it only showed the title and maybe a few lines, so you'd *always* have to go visit the original page.
Saos
Long time nixers
Love them, will use them whenever possible as an alternative to following people on YouTube/Twitter/podcasting platforms. Newsboat (and the built-in podboat) are terminal-based and easy to use once you have the XML.
twee
Members
I love feeds; I access them via email, using rss2email at the back end. My only problem with the program is that it doesn't do anything about feeds that only provide summaries or links; my hackish solution is to not follow those websites. Previously I used Newsboat, which had an option to automatically download the full page for feeds that provided only summaries; I'm not sure how it detected those, though.

I also get things like some YouTube channels via RSS (I don't have a Google account), and stick the URLs the feed provides through a script that queues them for downloading at some point.
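For anyone curious: YouTube publishes a per-channel Atom feed at a predictable URL, so the queueing script can be tiny. A stdlib-only sketch (the channel id, queue path, and function names here are illustrative, not my actual script):

```python
import re
from pathlib import Path

def channel_feed_url(channel_id):
    """YouTube's per-channel Atom feed lives at a fixed URL."""
    return "https://www.youtube.com/feeds/videos.xml?channel_id=" + channel_id

def queue_new_videos(feed_xml, queue_path):
    """Append watch links found in the feed that aren't queued yet."""
    queue = Path(queue_path)
    seen = set(queue.read_text().splitlines()) if queue.exists() else set()
    links = re.findall(r"https://www\.youtube\.com/watch\?v=[\w-]+", feed_xml)
    # dict.fromkeys dedupes while keeping the feed's order
    fresh = [url for url in dict.fromkeys(links) if url not in seen]
    with queue.open("a") as f:
        for url in fresh:
            f.write(url + "\n")
    return fresh

# Usage sketch:
#   import urllib.request
#   xml = urllib.request.urlopen(channel_feed_url("UC...")).read().decode()
#   queue_new_videos(xml, "queue.txt")
```

The queue file is just one URL per line, so the actual downloader can be anything that reads it later.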

Like a few others in this thread, I like blogrolls as a way of discovering sites. I'd be interested in a piece of software that automatically converts an OPML file into an HTML page, or something along those lines.
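The conversion is simple enough that a stdlib sketch fits here (the function name and HTML shape are just illustrative):

```python
import xml.etree.ElementTree as ET
from html import escape

def opml_to_html(opml_text):
    """Render the feeds listed in an OPML file as a simple HTML blogroll."""
    root = ET.fromstring(opml_text)
    lines = ["<ul>"]
    for outline in root.iter("outline"):
        # Prefer the human-facing site URL; fall back to the feed URL.
        url = outline.get("htmlUrl") or outline.get("xmlUrl")
        title = outline.get("title") or outline.get("text") or url
        if url:
            lines.append('<li><a href="%s">%s</a></li>'
                         % (escape(url, quote=True), escape(title)))
    lines.append("</ul>")
    return "\n".join(lines)
```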

(23-06-2020, 01:08 PM)vain Wrote: I was naïvely hoping to get recommendations along the lines of Liferea – but, yes, SaaS is probably what normal users want. (Maybe even that is too much work. I have no idea. I live in a different world. :-))

There's a world out there. I started out with QuiteRSS, and have also enjoyed Rawdog, Elfeed, and Gnus (with Gwene, or another similar thing whose name has entirely escaped me). They're all local, and they all provide a slightly different experience, depending on what tickles you.
bouncepaw
Members
Been using RSS for a couple of years. I have recently moved from Feedly to a Miniflux instance a friend hosts for us both.

RSS makes me happy. It is easy to read good content from many places without actually visiting them. Best of all, I don't have to go to LiveJournal (it barely loads on my phone with 4 GB of RAM!).

Though I still consume most of my content through social media, mostly Telegram (awesome in all ways) and Twitter (cool idea, bad implementation).

It's sad how hard it is to access content on social media nowadays. Reddit, for example: I have no desire to see your READ IN OUR APP popup twice every time I click anything. I even have the app installed; awful.

RSS is peaceful. I recommend everyone to give it a try.
acg
Members
Recently I've changed my social media behavior; I mostly used Twitter. I unfollowed many people in hopes of curating the content, but even the ones I know post quality content (which I have separated into Lists) tend to post other stuff from time to time, and it ends up contaminating my feed.

After that I tried doing the same with Mastodon, but it wasn't really a solution either. The content is better, but it also depends on the instance.

Finally (starting June 2020) I opened an old Newsboat `urls` file I had and started from there. I enjoyed the experience much more, but I usually read while commuting, and pulling out my laptop on public transportation is a no-go. So, as mentioned by @bouncepaw, I also set up Miniflux; from the web it's... decent. The Android apps I've tried that use the Fever API haven't been maintained for some time or are still buggy (I can't even mark items as read).

I'm still looking for alternatives; I'll keep you updated, since Miniflux feels promising and is just a binary (though it requires PostgreSQL) that one can easily host.
twee
Members
(01-07-2020, 09:11 AM)acg Wrote: I'm still looking for alternatives

What features are you looking for? Do you like the ability to sort by feed? As mentioned by a few people, rawdog is great for river-of-news feeds; just stick it in cron every hour or so and output to a web page you can access. It's written in Python, so it obviously isn't just a binary. If that's a dealbreaker for you, you might be interested in hacking something similar together with sfeed. There are probably other things that would suit you, but I don't know of them.
acg
Members
(01-07-2020, 09:40 AM)twee Wrote: As mentioned by a few people, rawdog is great for river-of-news feeds; just stick it in cron every hour or so and output to a web page you can access. It's written in Python, so it obviously isn't just a binary.

I only looked at rawdog for a moment. The river-of-news looks great; I already use a different approach for saving links, so that's basically all I need. The whole "just a binary" thing is mostly about reducing dependencies and keeping my server clean, but I could also benefit from rawdog being static with no database.

Should be easy to get that running. From what I gather, it only produces a webpage with the aggregated feeds, so there's no way to sync for offline reading, right? Not crucial.

EDIT: rawdog doesn't support Python 3, which keeps me from trying it.
ckester
Members
Yeah, rawdog is still based on Python 2.7 and there doesn't seem to be any work on migrating it to 3. That's why I have had moving to sfeed in mind as a backburner project for quite a while now. Maybe I should get to work on that.

sfeed lends itself to a pipes-and-filters approach that I find congenial. I'm not sure I like the way sfeed_update "merges" feeds, but I haven't looked too deeply at that yet. I might end up using only the XML parser from sfeed and writing my own code to build the river-of-news pages.

A more serious concern with sfeed_update is that it doesn't seem to check whether a feed has actually been updated since the last time it was fetched. It looks like it uses curl(1) to download the whole feed each and every time. Not cool, generating unnecessary traffic like that. As I mentioned before, some sites include the entire article content in their feed.

(I need to go back and look at how rawdog is dealing with unchanged feeds: is it just a matter of making a conditional GET with If-Modified-Since? I seem to recall that the pertinent date info is stored in the rawdog db for each feed...and I don't see anything like that in the sfeedrc or generated files. But this is just a first impression and I could be wrong.)

UPDATE: yes, looking at the fetch() function in rawdog.py and the arguments it passes to feedparser there, I see that rawdog uses feedparser's etag and/or last-modified features to save bandwidth when the publisher supports them. With some changes to the sfeedrc file, or the creation of a db similar to rawdog's, it shouldn't be too hard to add the same feature to sfeed_update's use of curl(1). So there's item #1 on the TODO list. ;)
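For the record, curl already ships the needed building blocks, so it's mostly a matter of threading the right flags through sfeed_update (file names here are placeholders; the ETag options require curl 7.68 or newer):

```shell
# First fetch: store the feed and whatever ETag the server sent.
curl -s --etag-save feed.etag -o feed.xml "$FEED_URL"

# Later fetches: -z sends If-Modified-Since based on feed.xml's mtime,
# --etag-compare sends If-None-Match from the saved ETag; on a
# 304 Not Modified, curl writes nothing and the body isn't transferred.
curl -s -z feed.xml --etag-compare feed.etag --etag-save feed.etag \
     -o feed.xml.new "$FEED_URL"
```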
ckester
Members
This got posted recently and was updated today. It might interest readers of this thread:
https://codemadness.org/sfeed_curses-ui.html
Dworin
Members
I've been using newsboat for a while, just for the general things I know I want to browse. If I want to pursue something, I'll switch to the browser.
jkl
Long time nixers
Next step in developing the RSS file system: I’ll add a cache. Hopefully, next week!

edit: Done!
fre d die
Members
(01-07-2020, 09:25 PM)ckester Wrote: This got posted recently and was updated today. It might interest readers of this thread:
https://codemadness.org/sfeed_curses-ui.html
I've recently been looking into setting up RSS. Not sure if I'm doing something wrong, but all sfeed_update seemed to do was curl the links and then create an empty feeds.new file. And emerging newsboat results in some sort of compilation error...