A piece explaining why your Twitter or Facebook feed probably isn’t telling you the whole story. Everyone’s online experience is curated to bolster what they already believe. This has horrible implications for a democratic society.
How can we discuss important news when our fundamental understandings of relevant facts are designed to clash?
“Filter bubble” is the term coined for the personalized lens created by the algorithms that curate your news feeds across social media platforms in ways that appeal to your personal biases. Your filter bubble is built from content you’ve interacted with in the past, and it promotes content the algorithm expects you to interact with in the future.
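To make the idea concrete, here is a toy sketch of engagement-based ranking. Real platform algorithms are proprietary and vastly more sophisticated; every function name, field, and weight below is a made-up assumption, not how any actual feed works.

```python
# Purely illustrative toy model of an engagement-ranked feed.
# All names and numbers here are invented for demonstration.

def engagement_score(post, user_history):
    """Score a post by how much it resembles what this user has
    engaged with before (topic overlap, in this toy model)."""
    overlap = post["topics"] & user_history["liked_topics"]
    return len(overlap) * post["predicted_clicks"]

def rank_feed(posts, user_history):
    """Order the feed so the most 'engaging' posts come first --
    which also tends to mean the most bias-confirming posts."""
    return sorted(posts,
                  key=lambda p: engagement_score(p, user_history),
                  reverse=True)

user = {"liked_topics": {"politics", "dogs"}}
posts = [
    {"id": 1, "topics": {"politics"}, "predicted_clicks": 5},
    {"id": 2, "topics": {"cooking"}, "predicted_clicks": 9},
    {"id": 3, "topics": {"politics", "dogs"}, "predicted_clicks": 4},
]
print([p["id"] for p in rank_feed(posts, user)])  # → [3, 1, 2]
```

Notice that the cooking post, despite being predicted to get the most clicks overall, ranks last for this user: the ranking rewards similarity to past behavior, which is the feedback loop the article is describing.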
Picture this: An important story is born in the real world, and it will soon be big news. Before this story reaches you, it is rehashed and recontextualized by numerous media outlets, each applying its own slant. Two reporters could come to completely different conclusions regarding who’s guilty, who’s lying, or who’s being set up. Your filter bubble helps choose which of these reports should appear on your timeline.
This is worse than the partisan slant of televised media, because on social media, information is tailored to you as an individual, often in ways that exploit your psychological weaknesses. Everything you post and interact with, from puppy videos to pornography, is fed into an algorithm whose job is to keep you online.
I did an experiment, and I encourage you to do it too. This will only work if you’re a Twitter user, and probably only if you often engage with political content. Look at the replies to a tweet by a controversial figure you’re interested in, but do it twice: Once while logged into your own account, and once from a different browser, logged out.
I did the experiment using this tweet from Donald Trump. It’s a video of Joe Biden saying he’s never talked to his son about his business dealings in Ukraine, followed by an edit of Nickelback’s video for “Photograph” in which the singer holds up a photo of Joe Biden, his son, and a Ukrainian oil executive.
From my own account, in the replies to this tweet, I mostly saw either completely unrelated tweets, or tweets agreeing that the video demonstrated Joe Biden lying. Then, in a different browser, logged out, I saw nothing but replies ridiculing Trump.
I am not here to tell you which worldview presented by Twitter’s choice of replies is correct. I am here to remind you that you cannot trust what social media shows you to be representative of reality, because what you are shown is tailored to bolster your preferred version of reality. When you click on a trending topic and see unanimous positivity, someone else might do the same thing and see nothing but attacks.
This has horrible implications for a democratic society. Social media companies hold the power to grow or shrink thought-groups by prioritizing or deprioritizing content sympathetic to them. Elections can be influenced through targeted media designed with psychology in mind (this was at the heart of the Cambridge Analytica scandal surrounding Trump’s campaign). The things I believe about today’s top news can be completely opposite to the things you believe, because accuracy has taken a back seat to clickability, and all we’re left to do is argue over whose sources are more credible.
It has never been more important to actively seek out people you disagree with and attempt to understand why they believe what they do. No one is right about everything, but people conscious of that fact will find themselves becoming right a lot more often than people who aren’t.
Follow me on Twitter @FakeProfileGuy