Living Inside the Filter Bubble: How Search Features Reinforce Your View of the World

Today’s post is a follow-up to my last one about how search engines can help perpetuate misinformation. In that post, I cited psychiatric research showing that people tend to disbelieve information that conflicts with their view of the world. Today’s post is about how web technologies can filter out information that conflicts with your view of the world … before you even see it.

Imagine your significant other always agreed with you. “Fat chance,” you say.

Now imagine your kids always agreed with you. “You’re dreaming.”

And your boss. And your co-workers. “Get real!”

And all registered voters. “You’re losing touch with reality.”

Could be we all are … if we rely on the search features of many popular websites.

Results Tailored to Your Likes

Search algorithms limit information presented to users based on factors such as location, language, past click behavior and search history. This process progressively filters information over time as search features get to know your likes and dislikes. As a result, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles. This phenomenon has been dubbed “the filter bubble.”

The term was coined by Eli Pariser, who wrote a book on the topic, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think. Examples include:

  • Personalized search results
  • Personalized news streams
  • Personalized shopping

Logical consequences of living in a technology-created, self-perpetuating bubble:

  • You get less exposure to conflicting viewpoints
  • You become isolated intellectually
  • You become closed off to new ideas, subjects and important information
  • You get the impression that your narrow self-interest is all that exists
  • Your outlook narrows

In an example related by Pariser, a broker searched Google for “BP” and got investment news about British Petroleum, while an activist got information about the Deepwater Horizon incident in the Gulf of Mexico.
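
To make that concrete, here’s a toy Python sketch of how click-based personalization can split two users’ results. It’s purely illustrative: the topics, weights and function names are my own inventions, and no real search engine is anywhere near this simple. Two users issue the same query against results with identical base relevance, but each user’s click history nudges the ranking toward what they already favor:

```python
from collections import Counter

def personalized_rank(results, click_history):
    """Re-rank results: base relevance plus a boost for topics
    the user has clicked before (hypothetical scoring)."""
    topic_clicks = Counter(click_history)

    def score(result):
        _title, topic, relevance = result
        # Every past click on a topic nudges similar results upward,
        # so the ranking drifts toward what the user already prefers.
        return relevance + 0.5 * topic_clicks[topic]

    return sorted(results, key=score, reverse=True)

# Same query, same base relevance -- only the click histories differ.
results = [
    ("BP posts strong quarterly earnings", "investing", 1.0),
    ("Deepwater Horizon cleanup continues", "environment", 1.0),
]

broker = personalized_rank(results, ["investing", "investing"])
activist = personalized_rank(results, ["environment"])

print(broker[0][0])    # BP posts strong quarterly earnings
print(activist[0][0])  # Deepwater Horizon cleanup continues
```

Run it and the two users get opposite top results from the same query; feed each user’s new clicks back into the history and the gap only widens. That feedback loop is the bubble in miniature.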

According to Pariser, filter bubbles can undermine civil discourse (we’ve seen plenty of that lately) and make people more vulnerable to “propaganda and manipulation.” In 2011, The Economist quoted him as saying:

“A world constructed from the familiar is a world in which there’s nothing to learn … (since there is) invisible autopropaganda, indoctrinating us with our own ideas.”

The flip side to the intellectual-isolation argument is, of course, convenience. How many of us would use search engines if they fed us pages in languages we didn’t understand, led us to pizza places a thousand miles away, or pointed us to information that never seemed relevant?

Some people question the extent to which filtering actually takes place. But Google recently introduced an option that lets people opt out of filtered searches. That move came after a new search engine called DuckDuckGo proudly touted its unbiased, filter-free searches. I smell smoke.
