Wednesday, June 22, 2011

Breaking out of the internet filter bubble

Eli Pariser is the former executive director of the liberal activism site MoveOn.org and co-founder of the international political site Avaaz.org. His new book, The Filter Bubble, examines how web personalisation is influencing the content we see online. New Scientist caught up with him to talk about the filters he says are shaping our view of the world, and to hear why he thinks it's so important to break out of the bubble.


What is the "filter bubble"?
Increasingly we don't all see the same internet. We see the stories, ideas and facts that make it through a membrane of personalised algorithms surrounding us on Google, Facebook, Yahoo and many other sites. The filter bubble is the unique personal universe of information that results - the one we increasingly live in online.
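
To make that membrane a little more concrete, here is a minimal sketch in Python - the posts, affinity scores and cut-off are all invented for illustration - of the kind of ranking-and-truncation a personalised feed performs: items from sources you have engaged with before float to the top, and everything below the cut-off simply never appears.

```python
# Minimal sketch of a personalised feed filter (illustrative only).
# The posts, affinity scores and cut-off are invented for this example.

posts = [
    {"source": "close_friend",  "topic": "holiday photos"},
    {"source": "news_outlet_a", "topic": "election coverage"},
    {"source": "acquaintance",  "topic": "opposing political view"},
    {"source": "close_friend",  "topic": "restaurant review"},
    {"source": "news_outlet_b", "topic": "foreign policy analysis"},
]

# Affinity inferred from past clicks and likes: higher means shown more often.
affinity = {
    "close_friend": 0.9,
    "news_outlet_a": 0.4,
    "news_outlet_b": 0.2,
    "acquaintance": 0.1,
}

def personalised_feed(posts, affinity, top_n=3):
    """Rank posts by inferred affinity and drop everything below the cut-off."""
    ranked = sorted(posts, key=lambda p: affinity[p["source"]], reverse=True)
    return ranked[:top_n]

for post in personalised_feed(posts, affinity):
    print(post["source"], "-", post["topic"])

# Posts from low-affinity sources (here, the acquaintance with opposing views)
# never surface, even though the user never explicitly chose to hide them.
```

The particular scoring does not matter; the point is that the dropped items are chosen by the system's inferences about you rather than by any explicit decision you made.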

You stumbled upon the filter bubble when you noticed Facebook friends with differing political views were being phased out of your feed, and people were getting very different results for the same search in Google. What made you think all of this was dangerous, or at least harmful?
I take these Facebook dynamics pretty seriously simply because it's a medium that one in 11 people now use. If, at a mass level, people don't hear about ideas that are challenging, or only hear about ideas that are likeable - as in, you can easily click the "like" button on them - that has fairly significant consequences. I also still have a foot in the social change campaigning world, and I've seen that a campaign about a woman being stoned to death in Iran doesn't get as many likes as a campaign about something warmer and fuzzier.

Do you think part of the problem is that Facebook is still largely used for entertainment?
It's definitely growing very rapidly as a news source. There was a Pew study which found that 30 per cent of people under 30 use social media as a news source. I would be surprised if, in 15 years, reading the news still looks like seeking out a bunch of particular news organisations and seeing what's on their front pages.

We have long relied on content filters - in the form of publications or TV channels we choose. How is the filter bubble different?
First, yes, we've always used filters of some sort, but in this case we don't know we're using them. We think of the internet as a place where we connect directly with information, but in fact there are intermediaries - Facebook and Google - sitting in the middle in just the same way that editors did in 20th-century society. This is invisible; a lot of the time we don't even see or acknowledge that there is filtering at work.
The second issue is that it's passive. We're not choosing a particular editorial viewpoint, and because we're not choosing it, we don't have a sense of on what basis information is being sorted. It's hard to know what's being edited out.
And the final point is that it's a unique universe. It's not like reading a magazine, where every reader sees the same set of articles. Your information environment could differ dramatically from that of your friends, neighbours and colleagues.

You have suggested that the filter bubble deepens the disconnect between our aspirational selves, who put Citizen Kane high on the movie rental queue, and our actual selves, who really just want to watch The Hangover for the fifth time. Is there a danger inherent in that?
The industry lingo for this is explicit versus revealed preferences. Revealed preferences are what your behaviour suggests you want, and explicit preferences are what you're saying you want. Revealed preferences are in vogue as a way of making decisions for people because now we have the data to do that - to say, you only watched five minutes of Citizen Kane and then turned it off for something else.
But when you take away the possibility of making explicit choices, you're really taking away an enormous amount of control. I choose to do things in my long-term interest even when my short-term behaviour would suggest that it's not what I want to do all the time. I think there's danger in pandering to the short-term self.
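
As a rough sketch of that distinction, with invented viewing data rather than anything from a real service: the explicit preference is the rating or queue position the user stated, the revealed preference is what their watch time implies, and a recommender that weights revealed behaviour will keep serving the short-term self.

```python
# Illustrative sketch of explicit vs revealed preferences (invented data).

# Explicit preference: what the user says they want (e.g. ratings, a queue).
explicit_ratings = {"Citizen Kane": 5.0, "The Hangover": 3.0}

# Revealed preference: what behaviour suggests, here minutes actually watched
# out of each film's running time.
minutes_watched = {"Citizen Kane": 5, "The Hangover": 100}
running_time    = {"Citizen Kane": 119, "The Hangover": 100}

def revealed_score(title):
    return minutes_watched[title] / running_time[title]

def recommend(weight_revealed, weight_explicit):
    """Pick the title with the highest weighted mix of the two signals."""
    def score(title):
        return (weight_revealed * revealed_score(title)
                + weight_explicit * explicit_ratings[title] / 5.0)
    return max(explicit_ratings, key=score)

print(recommend(weight_revealed=1.0, weight_explicit=0.0))  # The Hangover
print(recommend(weight_revealed=0.0, weight_explicit=1.0))  # Citizen Kane
```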

What you're promoting has been characterised as a form of "algorithmic paternalism", whereby the algorithm decides what's best for us.
What Facebook does when it selects "like" rather than "important" or "recommend" as the name of its button is paternalistic, in the sense that it's making a choice about what kind of information gets to people on Facebook. It's a very self-serving choice for Facebook, because a medium that only shows you things that people like is a pretty good one for selling advertising. These systems make value judgments, and I think we need to hold them to good values as opposed to merely commercial ones. But that's not to say that you could take values out of the equation entirely.

Your background is in liberal activism. Do you think the reaction to your ideas as algorithmic paternalism has to do with a perception that you're trying to promote your own political views?
If people think that, they misread me. I'm not suggesting we should go back to a moment where editors impose their values on people whether they want it or not. I'm just saying we can do a better job of drawing information from society at large, if we want to. If Facebook did have an "important" button alongside the "like" button, I have real faith that we would start to promote things with more social relevance. It's all about how you construct the medium. That's not to say that my idea of what is important would always win out; it's just that someone's idea of what is important would, rather than nobody's.

You've repeatedly made the case for an "important" button on Facebook, or maybe, as you've put it, an "it was a hard slog at first but in the end it changed my life" button. Do you think really what you're asking Facebook to do is grow up?
Yeah. In its most grandiose rhetoric Facebook wants to be a utility, and if it's a utility, it starts to have more social responsibility. I think Facebook is making this transition, in that it's moved extraordinarily quickly from a feisty insurgent that was cute, fun and new, to being central to lots of people's lives. The generous view is that they're just catching up with the amount of responsibility they've all of a sudden taken on.

Your argument has been called "alarmist", and as I'm sure you're aware, a piece in Slate recently suggested that you're giving these algorithms too much credit. What's your response to such criticism?
There are two things. One is that I'm trying to describe a trend, and I'm trying to make the case that it will continue unless we avert it. I'm not suggesting that it's checkmate already.
Second, there was some great research published recently in a peer-reviewed internet journal which shows that the effects of personalisation on Google are quite significant: 64 per cent of results differed between the users they tested, either appearing at a different rank or not appearing at all. That's not a small difference. In fact, in some ways everything below the first three results is mostly irrelevant, because people mostly click on those first three. As Marissa Mayer has said in an interview, Google used not to personalise the first results for precisely this reason. Then, when I called them again, they said that actually they're doing that now. I think it's moving faster than many people realise.
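
As a rough illustration of the kind of comparison behind that figure - the result lists below are invented, not taken from the study - one can measure how far two users' results for the same query diverge by counting results that are missing from the other user's list or that sit at a different rank.

```python
# Illustrative sketch: how different are two users' results for the same query?
# The result lists are invented; the study cited above compared real users'
# Google results in a similar spirit.

user_a = ["climate-report", "news-site-1", "blog-x", "wiki-article", "shop-y"]
user_b = ["news-site-1", "climate-report", "video-z", "wiki-article", "forum-q"]

def divergent_fraction(a, b):
    """Fraction of user A's results that are absent from B's list
    or appear there at a different rank."""
    differing = 0
    for rank, result in enumerate(a):
        if result not in b or b.index(result) != rank:
            differing += 1
    return differing / len(a)

print(f"{divergent_fraction(user_a, user_b):.0%} of results differ in identity or rank")
```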

You offer tips for bursting the filter bubble - deleting cookies, clearing browser history, etc. - but, more broadly, what kind of awareness are you hoping to promote?
I just want people to know that the more you understand how these tools actually work, the more you can use them rather than having them use you.
The other objective here is to highlight the value of the personal data that we're all giving to these companies and to call for more transparency and control when it comes to that data. We're building a whole economy that is premised on the notion that these services are free, but they're really not free. They convert directly into money for these companies, and that should be much more transparent.
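
For the cookie-related tips mentioned in the question, here is a minimal sketch, using only the Python standard library and a placeholder URL, of fetching a page with a fresh, empty cookie jar each time, so no personalisation cookies from earlier browsing are sent along and none are kept afterwards. It is an illustration of the general idea rather than anything prescribed in the book.

```python
# Illustrative sketch: fetch a page without carrying personalisation cookies
# from one session to the next. Standard library only; the URL is a placeholder.

import urllib.request
from http.cookiejar import CookieJar

def fetch_without_history(url):
    """Fetch a page with a brand-new, empty cookie jar, so nothing from
    previous browsing is sent and nothing is retained afterwards."""
    jar = CookieJar()  # starts empty and is discarded when this function returns
    opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))
    with opener.open(url) as response:
        return response.read()

page = fetch_without_history("https://example.com/")
print(len(page), "bytes fetched with no stored cookies")
```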

Source: New Scientist
