
Facebook is making us more narrow-minded, study finds

Photo: Flickr / Esther Vargas

Nearly two-thirds of Americans get their news from Facebook — whether it’s on their news feed, a friend’s timeline or the social networking site’s controversial Trending Topics box. But while the site is helping us stay connected and informed, Facebook has also been found to make its users narrow-minded.

So says a study published in the Proceedings of the National Academy of Sciences, which found that Facebook creates an “echo chamber”: a place where like-minded people share biased views, conspiracy theories, scientific studies or other polarizing topics, and their beliefs go unchallenged.

This means any beliefs users hold are simply repeated back to them, even if they’re bunk.

Academics from Boston University and abroad analyzed Facebook data on the topics people discussed between 2010 and 2014, and discovered that once a piece of information was accepted as “fact,” whether true or merely “alternative,” it spread quickly through the user’s online community, seen by friends and fellow group members, even when it had no basis or proof.

That phenomenon proved explosive in the 2016 election, when fake, unverified or exaggerated news articles were shared widely among users, spreading misinformation on a massive scale.

In fact, a BuzzFeed analysis of the spread of news on Facebook found that the top fake news stories about the election outperformed top election stories from nearly 20 major news outlets combined.

The social networking site’s algorithms, especially in the Trending Topics area, promoted untrue stories because they drew high engagement, which in turn pushed engagement rates even higher for hyperpartisan political pages.
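To see why engagement-driven ranking rewards falsehood, consider a minimal sketch of that kind of scoring. The story fields and weights below are invented for illustration; they are not Facebook’s actual formula.

from dataclasses import dataclass

@dataclass
class Story:
    title: str
    shares: int
    comments: int
    reactions: int
    verified: bool  # whether the claim has been fact-checked (never used below)

def engagement_score(story: Story) -> float:
    # Purely engagement-driven: shares, comments and reactions all count,
    # while accuracy ("verified") never enters the score.
    return 2.0 * story.shares + 1.5 * story.comments + 1.0 * story.reactions

stories = [
    Story("Fabricated scandal", shares=9000, comments=4000, reactions=20000, verified=False),
    Story("Sober policy report", shares=300, comments=150, reactions=900, verified=True),
]

# Sorting by engagement alone surfaces whatever spreads fastest, so the
# unverified story tops the trending list.
for s in sorted(stories, key=engagement_score, reverse=True):
    print(f"{engagement_score(s):>9.1f}  {s.title} (verified={s.verified})")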

“Whether a claim (either substantiated or not) is accepted by an individual is strongly influenced by social norms and by the claim’s coherence with the individual belief system – i.e., confirmation bias,” the study’s authors write.

The issue gets worse when other users attempt to debunk false claims. The World Economic Forum, which has called online misinformation one of the greatest threats to society, found that people exposed to “debunking” are more likely to defend their bias.

In other words, confronting misinformation with facts can actually encourage its spread.

There are several solutions in development to combat the spread of false news, but they’re tricky, too. The study highlighted Google’s attempts to factor a trustworthiness score into its ranking of search results, and Facebook has proposed a method that lets users flag information they believe is false, so the site’s algorithm can demote it.
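A rough sketch of how such user flags might feed back into ranking, assuming each “false” flag simply subtracts from a story’s score (the penalty weight here is invented for illustration):

def adjusted_score(base_score: float, false_flags: int, penalty: float = 50.0) -> float:
    # Each "this is false" flag from a user lowers the story's ranking score.
    return base_score - penalty * false_flags

print(adjusted_score(base_score=1000.0, false_flags=3))   # 850.0: lightly flagged
print(adjusted_score(base_score=1000.0, false_flags=25))  # -250.0: effectively buried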

But in these echo chambers, leaving the burden of proof on users can lead to accurate stories being incorrectly flagged as false, magnifying the effect of fake news.