Spoon-fed Opinions – One Post at a Time

Facebook algorithms and online commentary often discourage critical thinking and foster dangerous echo chambers

Scroll through Facebook and Twitter today, and you’ll likely find that much of the critical thinking about current events and news stories has already been done for you. Articles arrive prefaced with people’s opinions, key quotes underscored to drive their points home.

News becomes easily digestible this way – instead of reading a 700-word article on the Syrian refugee crisis, why not just read a short caption with activist Linda Sarsour’s take on the issue? Be it out of laziness, a lack of interest or a bit of both, the social media news cycle doesn’t force us to digest information or think for ourselves.

Instead, we take the more convenient route and simply adopt the opinions presented to us outright. Online, it’s insightful and witty commentary that usually garners more attention than the news itself.

Maybe that’s why newspapers – beyond their inability to keep up with a digital world – have died down over the years. Hard news is bland to digest on its own, and in print the only way to gain an outside perspective on an issue is to actually strike up a conversation and present your own views.

Now don’t get me wrong – opening yourself up to opinions that challenge your own can only help you better understand an issue. But I would argue that in the online space, that rarely happens. Many of us stop after reading the commentary preceding an article, only to scroll on and never look back.

Facebook’s echo chamber

One would think that being so connected to people all over the globe on social media would expose us to a diversity of opinions and perspectives. Yet paradoxically, we usually just float through the online space in our own personalized little bubbles.

Facebook is infamous for the so-called ‘echo chamber’: algorithms designed to produce assent rather than dissent. In other words, the site is built to primarily show us opinions and news we already agree with.

But Facebook’s algorithms aren’t solely at fault for creating these chambers – to some extent, human behaviour, both conscious and subconscious, plays a part as well. If you’re anything like me, many of your Facebook friends, much like your “real” friends, are like-minded people who come from your demographic and share similar beliefs or values. The same goes for the social activists or influencers you choose to follow – we’re more likely to engage with people who share our ideals or beliefs. Think of it this way: if you’re a vehement white nationalist, you’re probably not going to follow Bernie Sanders on Twitter.

Personalization on Facebook – unfollowing people you disagree with, liking and commenting on posts from friends who ‘echo’ your beliefs – often goes hand in hand with a lack of ideological diversity. Eventually, the algorithm narrows your feed based on that past activity, surfacing the people who share your views and whom you regularly engage with, all to maximize your engagement on the site. Sure, the occasional conspiracy theory might still pop up from a relative you reluctantly added years ago, but as long as you steer clear of that “like” button, even those posts will eventually disappear from your newsfeed.
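For readers curious what that feedback loop looks like mechanically, here is a minimal sketch of engagement-based ranking. Everything in it – the scoring rule, the names, the structure – is an illustrative assumption, not Facebook’s actual system; the point is only how liking a post feeds back into what gets shown next.

```python
# Toy model of engagement-based feed ranking (illustrative only --
# the signals and scoring here are assumptions, not Facebook's system).
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    text: str

@dataclass
class User:
    # Count of past likes per author -- the "previous activity"
    # the ranker learns from.
    engagement: dict = field(default_factory=dict)

    def like(self, post: Post) -> None:
        self.engagement[post.author] = self.engagement.get(post.author, 0) + 1

def rank_feed(user: User, posts: list) -> list:
    # Score each post by how often the user engaged with its author.
    # Authors the user never interacts with score 0 and sink to the
    # bottom -- the loop that narrows the feed over time.
    return sorted(posts, key=lambda p: user.engagement.get(p.author, 0), reverse=True)

# Liking only the like-minded friend pushes the conspiracy-posting
# relative down the feed.
user = User()
feed = [Post("relative", "conspiracy theory"), Post("friend", "shared take")]
user.like(feed[1])
print([p.author for p in rank_feed(user, feed)])  # ['friend', 'relative']
```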

Not only are opinions being spoon-fed to us, but they’re often ones that never challenge our own views. There are workarounds that can help us break out of the bubbles Facebook has built for us, but for the most part, we need to bring a healthy skepticism to the news and opinions we get from social media. Chances are, Facebook is only showing you what you want to see.