People Always Think They’re Right & Facebook Algorithms Keep It That Way

Most of us tend to gravitate toward people who share beliefs, opinions, and viewpoints similar to our own, but when it comes to getting the latest news, we like to think we’re consuming unbiased information from well-informed sources and that we, as the receivers, are keeping an open mind toward a wide array of viewpoints. While this is a nice thought, in a time when social media dominates the way we acquire information about, well, everything, researchers are finding that this may be far from how the average internet user actually keeps abreast of world news.

A study recently published in Proceedings of the National Academy of Sciences investigated how content spreads on Facebook, examining the diffusion of both conspiracy theories and science news. The researchers found that content generally circulates within “echo chambers,” or polarized communities that tend to consume the same types of information. Echo chambers keep the same ideas circulating among people who already support them, both reinforcing the community’s worldview and making members more resistant to information that opposes their beliefs.
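To make that dynamic concrete, here is a minimal toy simulation of belief-consistent sharing across two polarized communities. It is a sketch under invented assumptions (community sizes, homophily, and share probabilities are illustrative), not the study’s actual model or data:

```python
import random

random.seed(42)

# Toy model: two polarized communities. Each user mostly follows
# like-minded users (homophily), so shared content rarely crosses over.
N = 200
users = [{"camp": "science" if i < N // 2 else "conspiracy"} for i in range(N)]

def pick_friends(user_idx, k=10, homophily=0.9):
    """Sample k friends, drawn mostly from the user's own camp."""
    my_camp = users[user_idx]["camp"]
    same = [j for j in range(N) if j != user_idx and users[j]["camp"] == my_camp]
    other = [j for j in range(N) if users[j]["camp"] != my_camp]
    return [random.choice(same if random.random() < homophily else other)
            for _ in range(k)]

def spread(seed_idx, item_camp, steps=5, share_prob=0.8):
    """Seed one user with an item and count who it reaches.

    Users are far more likely to re-share content that matches their
    camp's narrative (belief-consistent sharing); cross-camp items
    are mostly ignored.
    """
    exposed = {seed_idx}
    frontier = {seed_idx}
    for _ in range(steps):
        next_frontier = set()
        for u in frontier:
            p = share_prob if users[u]["camp"] == item_camp else 0.05
            if random.random() < p:
                next_frontier.update(pick_friends(u))
        frontier = next_frontier - exposed
        exposed |= frontier
    return exposed

reached = spread(seed_idx=0, item_camp="science")
inside = sum(1 for u in reached if users[u]["camp"] == "science")
print(f"reached {len(reached)} users; {inside} inside the echo chamber")
```

Run repeatedly, the item saturates the camp it matches and barely leaks into the other one, which is the circulation pattern the study describes.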


“Our findings show that users mostly tend to select and share content related to a specific narrative and to ignore the rest,” the study’s authors said. “Whether a news item, either substantiated or not, is accepted as true by a user may be strongly affected by how much it coheres with the user’s system of beliefs.”

This shouldn’t come as much of a shock to many, as Facebook is notorious for using algorithms to present individually tailored content to its users to keep them happy and active on its platform. The purpose of the algorithms is to serve users more of what they are likely to interact with – assuming that if you liked this topic, statement, product, or news story, you’ll probably like this one, too. That way, ads and content can be targeted to specific individuals, reinforcing their existing points of view. Users remain blissfully happy “knowing,” or rather, believing, more and more about less and less. Pair the algorithms with our natural inclination to seek out points of view similar to our own, and you’ve got a perfect recipe for an echo chamber.
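Facebook’s real ranking system is proprietary, but the feedback loop described above can be sketched in a few lines. The topics, headlines, and scoring rule below are illustrative assumptions, not the actual algorithm:

```python
from collections import Counter

def rank_feed(stories, like_history):
    """Rank candidate stories by predicted engagement.

    Toy heuristic: score each story by how often the user has liked
    its topic before -- "if you liked this, you'll probably like more
    of the same". The real ranking system is proprietary and far more
    complex; this only illustrates the feedback loop.
    """
    topic_affinity = Counter(story["topic"] for story in like_history)
    return sorted(stories, key=lambda s: topic_affinity[s["topic"]], reverse=True)

# Hypothetical user who has mostly liked one narrative so far.
history = [{"topic": "anti-vax"}] * 8 + [{"topic": "science"}] * 2
candidates = [
    {"topic": "science", "headline": "New peer-reviewed vaccine trial results"},
    {"topic": "anti-vax", "headline": "What THEY don't want you to know"},
    {"topic": "politics", "headline": "Election coverage roundup"},
]

for story in rank_feed(candidates, history):
    print(story["headline"])
```

Each interaction with the top-ranked story feeds back into the like history, so the next ranking comes out even more lopsided – that self-tightening loop is the algorithmic half of the echo chamber.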


The study also notes that digital misinformation going “viral” via social media has become so pervasive that the World Economic Forum has listed it among the main threats to our society.

“The Internet remains an uncharted, fast-evolving territory. Current generations are able to communicate and share information instantaneously and at a scale larger than ever before,” says Lee Howell, Managing Director of the WEF. “Social media increasingly allows information to spread around the world at breakneck speed. While the benefits of this are obvious and well documented, our hyperconnected world could also enable the rapid viral spread of information that is either intentionally or unintentionally misleading or provocative, with serious consequences.”

The internet is meant to be a place of pluralism – a community where many voices, often opposing ones, can be heard. While that may have been true of Web 1.0, as market forces and algorithms increasingly take effect, the diversity of voices is now so filtered and targeted that you may only be hearing reinforcements of what you already believe. So next time you’re checking the latest news on Facebook, ask yourself, “Am I hearing an echo?”
