World domination in a feedback loop
When those of a certain generation first hear of the "Filter Bubble," they might reflect on that brief two weeks in the mid-'90s when the band Filter was kind of popular. These days, the Filter Bubble, according to former MoveOn.org executive director Eli Pariser, is the means by which the Information Superhighway comes to function more like a private driveway, one on which only targeted and personalized information travels at the expense of a broader range of knowledge.
This shift to personalization raises as many questions about one's online privacy as it does about censorship, whether it's intentional or the result of an algorithm trying to give you what it thinks you might like - or buy. As Pariser explains in his book, "The Filter Bubble: What the Internet Is Hiding from You," a personalized world is one that only serves to confirm our existing beliefs as determined by the digital breadcrumbs we've left along the way. When we only receive information aligned with our religious or social or political beliefs, "It's difficult to maintain perspective," suggests Pariser. Or, as he whimsically put it during a recent chat at Seattle's Town Hall Center for Civic Life, "When you step to the side for a new perspective, it's as if the world moves to meet your gaze."
In an Amazon Q&A, in which Pariser credited the bookseller for its relative transparency as regards its product suggestions with the apropos example, "We're showing you Brave New World because you bought 1984," the Internet activist explained, "Research psychologists have known for a while that the media you consume shapes your identity. So when the media you consume is also shaped by your identity, you can slip into a weird feedback loop."
"The technology is invisible. We don't know who it thinks we are, what it thinks we're actually interested in," Pariser said to The Atlantic. "It locks us into a set of check boxes of interest rather than the full kind of human experience."
Of course, the Filter Bubble is more the unintended consequence of a business strategy than an insidious plot on the part of a gaggle of geeks in Silicon Valley to control your Internet experience and by extension your thinking. The fact is, in some cases, we're censoring it ourselves. As Pariser recently explained on KQED's Forum, "Because Facebook mainly uses how many people 'like' something as a means of figuring out what they should show other people, what that means is that you see well-liked news on Facebook."
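For the programmatically inclined, the dynamic Pariser describes can be sketched in a few lines of Python. This is a toy model, not Facebook's actual code: it simply assumes a feed that always shows whichever story already has the most likes, which is enough to show the rich-get-richer feedback loop.

```python
# Toy illustration (not Facebook's real algorithm): if a feed ranks
# stories purely by like count, the popular story gets shown more,
# collects more likes, and crowds the niche story out entirely.
import random

random.seed(1)
stories = {"well-liked news": 100, "niche report": 10}

for _ in range(50):
    # Show the currently most-liked story; roughly 1 in 5 viewers likes it.
    top = max(stories, key=stories.get)
    if random.random() < 0.2:
        stories[top] += 1

print(stories)
```

Run it and the "well-liked news" count climbs while the "niche report" never gets a single impression - the leader only extends its lead.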
Google, however, has 57 different ingredients in its secret sauce. Even when you're logged out of your account, Google gleans signals from your online behavior and applies them to a profile that it uses to tailor results more to your liking. Consequently, search results are like fingerprints: no two users' are quite the same. Try it - it's spooky.
It's no surprise Yahoo! News is experimenting with personalization, but few are aware that institutions like the New York Times are as well, in their online iterations and apps. I'm confident SonomaNews.com is not doing this, though if you find yourself reading me to the exclusion of all else, something has either gone horribly wrong with the system or my plans for world domination have kicked in, which, admittedly, is sort of the same thing.
It should be noted that the sources for this piece came entirely from links presented through the various mechanisms Pariser describes, so it's likely only part of the story - the part the robots intended to be seen by someone in the media to be shared with those it knows are reading that media. When I asked him personally what content producers could do to override the algorithm, Pariser essentially said we're SOL: "Content creators are at its mercy." Time to crank the Filter.
• • •
Daedalus Howell is filtered at the Future Media Research Lab; visit FMRL.com.