Tech alone cannot defeat fake news

In 2016, “fake news” entered our collective lexicon, bringing an instantly recognizable term to something that has existed for ages.

Hoax articles are nothing new. Publications like the Weekly World News and the Daily Sport have long filled their pages with fabulously implausible stories, like “Kim Jong Un is a space alien.” But this was something different.

On both sides of the political divide, the 2016 election was filled with hyper-partisan faux-reporting. Pope Francis, for example, purportedly endorsed both Donald Trump and Bernie Sanders.

Many have attributed Donald Trump’s stunning victory over his Democratic opponent to the deluge of fake news. I’m not sure how much stock you can put in that, but for those living in titanium-plated filter bubbles, it certainly reinforced their support for him.

Many technologists have rightly recognized that fake news is bad for our democracy. In response, they’ve created their own tools to fight it.

You can’t dispute their intentions. Fake news is something most of us would like to see disappear for good. But can tech fix what amounts to a flaw in human nature?

If you talk about the role of fake news in the 2016 election, you can’t help but talk about Facebook. Many of the false, hyper-partisan news stories found their audiences through the social giant, and even outperformed stories from legitimate news outlets. These stories were so viral that then-President Barack Obama implored Mark Zuckerberg to take action.

At first, Zuckerberg was openly disdainful of the possibility that fake news on Facebook had played a meaningful role in the election. He later admitted he had misunderstood its impact, and Facebook resolved to take action.

In December, the site started marking fake news stories with a striking red flag and pointing readers to fact-checkers like Snopes, which disputed the stories’ factual accuracy.

It was a simple fix. It didn’t work. In fact, it had the opposite effect: in our polarized political landscape, it actually “entrenched deeply held beliefs.”

“Academic research on correcting misinformation has shown that putting a strong image, like a red flag, next to an article may actually entrench deeply held beliefs – the opposite effect to what we intended,” wrote Facebook product manager Tessa Lyons.

Instead, Facebook is showing fact-checked “related articles” next to these stories. This doesn’t merely undermine the…
