Eli Pariser’s new book The Filter Bubble: What the Internet Is Hiding from You and his TED Talk video are getting a lot of attention this week.
As web companies strive to tailor their services (including news and search results) to our personal tastes, there’s a dangerous unintended consequence: We get trapped in a "filter bubble" and don’t get exposed to information that could challenge or broaden our worldview. Eli Pariser argues powerfully that this will ultimately prove to be bad for us and bad for democracy.
In the Author Q&A on Amazon’s page for his book, Pariser writes:
We’re used to thinking of the Internet like an enormous library, with services like Google providing a universal map. But that’s no longer really the case. Sites from Google and Facebook to Yahoo News and the New York Times are now increasingly personalized – based on your web history, they filter information to show you the stuff they think you want to see. That can be very different from what everyone else sees – or from what we need to see. Your filter bubble is this unique, personal universe of information created just for you by this array of personalizing filters. It’s invisible and it’s becoming more and more difficult to escape.
Last year, NY Times tech reporter and blogger Nick Bilton published a book titled I Live in the Future & Here’s How It Works, in which he cited a research paper by Matthew Gentzkow and Jesse M. Shapiro, Ideological Segregation Online and Offline. The paper found "no evidence that the Internet is becoming more segregated over time."
In an April 2010 column titled Riders on the Storm, David Brooks wrote about this, too.
This study suggests that Internet users are a bunch of ideological Jack Kerouacs. They’re not burrowing down into comforting nests. They’re cruising far and wide looking for adventure, information, combat and arousal. This does not mean they are not polarized. Looking at a site says nothing about how you process it or the character of attention you bring to it. It could be people spend a lot of time at their home sites and then go off on forays looking for things to hate. But it probably does mean they are not insecure and they are not sheltered.
If this study is correct, the Internet will not produce a cocooned public square, but a free-wheeling multilayered Mad Max public square. The study also suggests that if there is increased polarization (and there is), it’s probably not the Internet that’s causing it.
For more, see this blog post from April 2010 by Michael Cervieri titled Does the Internet put us in Ideological Ghettos?
I’m not too worried about a filter bubble, as my ‘anchoring community’ seems to provide the antidote. Bilton wrote in his book:
I can tell you firsthand that thanks to my anchoring communities, I see a drastically wider range of viewpoints online than I’ve ever experienced reading a print newspaper, watching the nightly news, or reading select niche magazines.
What are anchoring communities? Bilton:
By offering their own digital links and connections, anchoring communities help us cope with the massive numbers of people and the incalculable amount of information online and give us neatly refined selections to sift through together. They help us contain information flow. These social networks provide cognitive road maps that help us navigate all the information and help relieve the mental taxation of trying to manage excessive information on one’s own.
Currently, Twitter is the online tool I use the most to connect me to my anchoring community, both for Northfield-related information as well as everything else. But the cool thing about living, working and being engaged in the Northfield community is that my daily face-to-face roaming about provides this, too.