As the database analysts say, "garbage in, garbage out."
I subscribed to the Ground News aggregator service eleven months ago after a personal recommendation from someone who did not (at the time) disclose their interest. It still seemed worth a look, so I signed up.
I have used the service almost daily across a good selection of technology platforms, and I feel comfortable making a recommendation. By way of disclosure, several months ago I set myself a calendar reminder to end my subscription before it auto-renews. It wasn’t long after I started using the service that I got that familiar feeling of being force-fed junk food.
The service is predominantly AI-driven, hoovering up news stories before making an AI judgement call on the bias and trustworthiness of the various news sources. It appears Ground News themselves don’t trust their AI’s call on the bias ratings, since they are user-editable.
Cognitive bias is extremely difficult for a human to break free from, and I can’t imagine how AI achieves it without true abstract thought. Bias is, after all, designed to make decision-making quicker and more efficient, so I would expect AI to be even more susceptible to it.
From my perspective, about the only value Ground News adds is the same aggregation service their counterparts offer, but with less clutter. They suffer the same problems their competitors do, including feeds full of expired, paywalled and advertorial news articles that waste a reader’s time. When will news services and podcasters learn that real humans don’t have limitless time and money for subscriptions? Some form of news aggregation and less intrusive advertising is the only future. Surely the crawler bots could check whether an article is available without a paid subscription. It appears they are so focussed on sucking up everything that they have sucked up the living-room rug by accident.
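To make that last suggestion concrete, here is a minimal sketch of the kind of check a crawler could run: fetch the page anonymously and look for the schema.org isAccessibleForFree flag that many publishers embed in their JSON-LD metadata, with a crude phrase match as a fallback. The function name, fallback phrases and example URL are my own illustrative assumptions; I have no idea how Ground News' crawlers actually work.

```python
import json
import re
import urllib.request

def looks_paywalled(url: str) -> bool:
    """Best-effort guess at whether an article sits behind a paywall."""
    # Fetch the page as an anonymous reader would see it.
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    html = urllib.request.urlopen(req, timeout=10).read().decode("utf-8", "ignore")

    # Many publishers flag paid content in schema.org JSON-LD metadata.
    for block in re.findall(r'<script[^>]*application/ld\+json[^>]*>(.*?)</script>',
                            html, re.S | re.I):
        try:
            data = json.loads(block)
        except json.JSONDecodeError:
            continue
        for item in (data if isinstance(data, list) else [data]):
            if isinstance(item, dict) and \
                    str(item.get("isAccessibleForFree", "")).lower() == "false":
                return True

    # Crude fallback: phrases that tend to appear only on locked articles
    # (these particular phrases are an assumption, not an exhaustive list).
    return bool(re.search(r"subscribe to continue|to read the full article", html, re.I))

# Hypothetical usage with a made-up URL:
print(looks_paywalled("https://example.com/some-article"))
```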
There is an opportunity here for someone to get this right. Instead of bypassing the best computing power on the planet (the human brain), they should be assisting it. Instead of allowing users to set their own bias ratings permanently, perhaps news aggregators could provide some telling statistics on other readers’ ages, regions and education levels. It’s not a stretch to see that the service’s AI has become slanted on a diet of lefty journalism. Most journalism is.
I am yet to be impressed by AI when it is used in roles that require abstract thought. Yes, I’ve seen some hilariously weird AI-generated YouTube videos, as well as correctly structured poetry. But unless it’s oddly weird, it’s boring. The work itself is never thought-provoking. Perhaps this is acceptable to anyone who has become accustomed to the politically correct garbage Disney and Netflix have been churning out.
It’s important to stress that AI, in this instance, is handling the information we consume to, presumably, make decisions at the ballot box. It might be better to get this wrong on our own than to get it wrong with help from AI. AI can, and will, be manipulated. Whether AI is supposed to be moral is a whole different conversation.
AI is nothing more than a tool following algorithms that process data scraped off the internet. To have more faith in AI than in our own kind sounds like some existential religious zealotry to me. I would not be the first to draw a line between zealotry and many popular social movements; I’ve written about it previously here.
It appears some of us have decided that the world would be a better place without humans.