Essentially, separating signal from noise is becoming more and more work, and not everything you want to keep track of even has a feed. I can't imagine what it must be like if your job is to parse news for a living. Imagine being an analyst at a bank and having to wade through the cruft you'd get in a news reader every day, not to mention the monthly publications and so on.
What I think is going to happen is that both browsers and aggregator services in the cloud are going to start enabling a lot more logic and customization. We see the start of it now with Greasemonkey scripts and browser plugins and extensions, but I think a next level of user-friendly artificial intelligence is needed: applications that parse web pages, gather content, and display it intelligently and economically, all without magic (which is generally wrong anyway), guided instead by your specific choices about what you think is good and bad. ScraperWiki is a first step toward this sort of thing, but really it's only the beginning.
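To make the idea concrete, here's a minimal sketch of that kind of user-directed filtering: no magic, just explicit good/bad rules supplied by the reader. The page markup, headlines, and keyword lists are all hypothetical, purely for illustration, and a real aggregator would obviously do far more.

```python
# Sketch: parse a page, collect candidate items, and filter them by
# user-supplied "good" and "bad" keywords. All sample data is made up.
from html.parser import HTMLParser

class HeadlineParser(HTMLParser):
    """Collect the text of <h2> elements as stand-in 'headlines'."""
    def __init__(self):
        super().__init__()
        self.in_h2 = False
        self.headlines = []

    def handle_starttag(self, tag, attrs):
        if tag == "h2":
            self.in_h2 = True

    def handle_endtag(self, tag):
        if tag == "h2":
            self.in_h2 = False

    def handle_data(self, data):
        if self.in_h2 and data.strip():
            self.headlines.append(data.strip())

def filter_headlines(headlines, good, bad):
    """Drop anything matching a 'bad' keyword, keep what matches a 'good' one."""
    keep = []
    for h in headlines:
        lower = h.lower()
        if any(b in lower for b in bad):
            continue
        if any(g in lower for g in good):
            keep.append(h)
    return keep

# A stand-in for a fetched page (hypothetical content).
page = """
<html><body>
<h2>Bank posts quarterly earnings</h2>
<h2>Celebrity gossip roundup</h2>
<h2>Central bank signals rate change</h2>
</body></html>
"""

parser = HeadlineParser()
parser.feed(page)
print(filter_headlines(parser.headlines, good=["bank", "rate"], bad=["gossip"]))
# Prints the two finance headlines; the gossip item is dropped.
```

The point isn't the keyword matching itself (which is crude), it's that the rules belong to the reader, not to some opaque relevance algorithm.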
Anyways, a few years ago I decided that the mobile web as a separate entity was a dead end because of quickly improving mobile browsers, and it turns out I was pretty spot on. It never dawned on me that the same logic could apply to web feeds, thanks to things like quickly improving server-side parsers and the poor user experience of feed readers, but now I'm seeing that it does. I personally still wouldn't launch a new site today without a decent feed, but I bet it'll be a short time before I stop worrying about it, and I bet there are a lot of other web developers who feel that way already.