I just finished reading The Filter Bubble by Eli Pariser, who
is the current president of moveon.org.
In keeping with the interests of that organization, Pariser’s book is an
attempt (at least tacitly) to expand the communitarian and civic capacities of
the Web. But he makes his way there by
arguing that the Web is confining rather than expanding our cognitive horizons. Instead of introducing us to a broader and more
varied set of people, the Web increasingly steers us toward points of view that are
congruent with, rather than divergent from, our own.
With personalized search and personalized social networking, the 'net
introduces us to places and people that we already like and are already
interested in. As searching and matching
algorithms improve, we’re increasingly exposed to material that is already
relevant to our lives. This, of course,
is good up to a point: we like relevance.
The downside is that we’re challenged less and less to consider or visit
perspectives that differ from our own.
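To make the mechanics concrete, here’s a toy sketch (my own construction, not Pariser’s and not any real engine’s algorithm) of a relevance-only recommender. It ranks topics purely by past clicks, so whatever a reader happens to click first steadily crowds everything else out of view:

```python
# Toy relevance-only recommender: an illustration of the filter-bubble
# feedback loop, not any real site's algorithm. Topic names are invented.
from collections import Counter
import random

TOPICS = ["politics-left", "politics-right", "sports", "science", "celebrity"]

def recommend(click_history, n_items=2):
    """Rank topics purely by how often they were clicked before and
    return the top n_items -- relevance only, no serendipity."""
    counts = Counter(click_history)
    ranked = sorted(TOPICS, key=lambda t: counts[t], reverse=True)
    return ranked[:n_items]

random.seed(1)
history = ["politics-left"]               # one initial click
for _ in range(10):
    shown = recommend(history)
    history.append(random.choice(shown))  # the reader clicks something shown

# The click history collapses onto the initial interest; the other
# topics are never shown again.
print(Counter(history))
```

After a handful of rounds the reader is only ever shown the topics closest to that first click; the rest of the catalog has silently disappeared.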
These trends have been in the works for many years now: Cass Sunstein famously
identified them as far back as 2001 in his book
Republic.com. But, as Pariser argues,
what makes them more worrisome in 2012 is that they’ve become more
insidious. In the past we narrowed our
horizons through conscious acts: we went to nytimes.com instead of foxnews.com
(or vice versa) more or less deliberately. But as the Web has become personalized, these
choices are increasingly made for us behind the scenes in ways that we’re only
vaguely aware of. When I visit
Amazon.com and shop for The Audacity of Hope, Amazon also suggests I buy Bill
Clinton’s memoir, but not, say, Bill O’Reilly’s Pinheads and
Patriots. And when I visit Facebook, my
friends, more often than not, seem to share similar points of view. Pariser doesn’t reference Marx, but the filter
is the modern generator of false consciousness.
In the past we did our own Web filtering.
But now our filters are selected behind the scenes. In the brave new world of the personalized
Web, our false consciousness is created for us.
In his closing chapter, Pariser offers up a number of things
that individuals, corporations, and governments can do to allay the more
insidious effects of filtering. He
suggests that as individuals we occasionally erase our tracks so that sites
have a harder time personalizing their content. (To paraphrase Pariser: “If we don’t erase our [Web] history, we are condemned to repeat it.”) For corporations, he suggests that their
personalization algorithms be made more transparent and that a little
serendipity be introduced into searches so we’re occasionally exposed to
something beyond our current interests and desires. And for governments, he suggests a stronger
role in overseeing and regulating personalization.
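It’s worth pausing on what that “little serendipity” might mean mechanically. Here’s a hedged sketch under my own assumptions (the function, the 10% rate, and the item names are illustrations, not anything Pariser specifies): reserve a slot or two in every result list for items from outside the user’s profile.

```python
# Hedged sketch of serendipity injection: mostly-relevant results with a
# few out-of-profile items mixed in. Parameters are illustrative assumptions.
import random

def serendipitous_ranking(relevant, out_of_profile, n=10, serendipity=0.1):
    """Fill most of the n result slots with the top relevant items, but
    reserve roughly serendipity * n slots for out-of-profile items."""
    n_random = max(1, round(n * serendipity))
    picks = list(relevant[: n - n_random])
    pool = [item for item in out_of_profile if item not in picks]
    picks += random.sample(pool, min(n_random, len(pool)))
    random.shuffle(picks)  # don't bury the serendipitous items at the bottom
    return picks

# Hypothetical usage: a "Lady Gaga" search that occasionally surfaces Darfur.
print(serendipitous_ranking(
    relevant=["lady-gaga-news", "lady-gaga-tour", "pop-charts"],
    out_of_profile=["darfur-explainer", "city-council-vote"],
    n=3,
))
```

Even this tiny sketch makes the worry discussed below concrete: someone still has to decide what goes into the out-of-profile pool.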
There are problems with Pariser’s suggested solutions, and
Evgeny Morozov, in his own review of the book, brings a very important one to
light. In expanding our civic,
communitarian, and serendipitous encounters, it would be nice if Google
occasionally popped up a link to “What is happening in Darfur?” when we type
“Lady Gaga” into the search box. But who exactly is supposed to decide what these serendipitous experiences are to
be? We may want to allay some of the
cognitive deficiencies that the current 'net breeds. But the danger in doing so is that we replace
one bias with another. In looking a little
further into this, I visited the thousands of doodles (i.e., custom banners) that
Google has generated in the past couple of years. Not surprisingly, I didn’t see much there
that’s overtly civic or political.
But maybe that sin of omission is better than the alternative: I prefer “don’t be evil” (Google’s current motto) to “do
good but risk partisanship and bias in the attempt.”
Pariser may not provide convincing fixes, but his description
of the problem makes the book a worthy read.
One would think that as the information stream accelerates we’d become
increasingly subject to distractions and to new ways of seeing the world. In fact, Clay Shirky touches on this point in
“It’s Not Information Overload. It’s Filter Failure”: the filters that the
mass media industry imposed on late-20th-century media consumers have been
corroded by the advent of the Web. But the
trends that Shirky highlights may be
reversing. Our cognitive horizons may be contracting rather than expanding in the age of personalization. And our attention blindness may be increasing
rather than decreasing as the filter bubble grows. In bringing those concerns to light,
Pariser has done good work.