Online services like Google and Facebook use algorithms to determine what information to deliver to you. Your “filter bubble” (a term coined by internet activist Eli Pariser) refers to the idea that this automated personalization, though helpful in some ways, can isolate you from other information. Sometimes called an “echo chamber,” the filter bubble created by your online activity can limit your exposure to differing points of view and make it harder for you to recognize fake news and bias.
In this now-famous TED Talk, Pariser discusses the effects of algorithms and warns us about the dangers of online filter bubbles.
What makes it so easy to believe fake news? As explained in the video Defining Confirmation Bias, people have a tendency "to accept information unquestioningly when it reinforces some existing belief or attitude," even when presented with contradictory evidence.
According to Psychology Today, "confirmation bias occurs from the direct influence of desire on beliefs. When people would like a certain idea/concept to be true, they end up believing it to be true. ... We pick out those bits of data that make us feel good because they confirm our prejudices. Thus, we may become prisoners of our assumptions."
In this blog post on confirmation bias, author David McRaney reminds us that "there’s always someone out there willing to sell eyeballs to advertisers by offering a guaranteed audience of people looking for validation." As McRaney admonishes, we must ask ourselves if we're in that audience.
You can never eliminate all of your biases, but you can actively seek out other points of view. You can't get rid of your filter bubble either, but you can take steps to manage it. Here are some suggestions: