
New Digital Divides: The Personalized “Filter Bubbles” Menacing Democracy

Instead of linking humans together, could digital technologies isolate them from each other? Could the personalization of web services produce ghettos? Could it threaten democracy itself? These are the dangers raised by Eli Pariser, president of MoveOn.org, on June 3, 2010, at the most recent Personal Democracy Forum.

Ethan Zuckerman reported on his remarks. First, an example of a personalized conference:

“What if we came to an event like Personal Democracy Forum, and sorted ourselves by gender, age, political ideology, hometown. Pretty soon, we’d all be sitting in small rooms, all by ourselves. What if speakers then offered personalized talks, adding explosions for the young male listeners, for instance. “You’d probably like your personal version better… but it would be a bad thing for me to do.” It renders moot the point of a conference – we no longer have a common ground of speeches that we can discuss in the hallways.”

“Google uses 57 available signals to personalize the web for you, even if you’re not logged in. As a result, the results you get on a Google search can end up being very different, even if quite similar people are searching. Eli shows us screenshots of a search for “BP” conducted by two young women, both living in the northeastern US. They get very different results… one set focuses on business issues and doesn’t feature a link on the oil spill in the top three, while the other does. And one user got 141 million results, while the other got 180 million. Just imagine how different those results could be for very different users.”

Now, let me take a moment to explain what personalization is. It is the automatic creation of web pages tailored to the specific interests of a particular user, based on indications that person has explicitly provided or on what has been observed through that person’s use of the web. It is the construction of an individual user profile from the personal information you have produced about yourself, or that has been deduced from the actions you have taken through digital devices. The typical case is the online bookstore that offers suggestions for books that might interest you, drawn from your own browsing profile and from those of thousands of other consumers like you. The typical line being: “Customers who bought this book also bought the following.”
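To make that bookstore example concrete, here is a minimal sketch in Python of the co-purchase counting that sits behind “customers who bought this also bought” suggestions. The purchase records and item names are invented for illustration; real recommendation engines are far more elaborate, but the principle of profiling from observed behaviour is the same.

from collections import Counter, defaultdict

# Invented purchase histories, one per customer.
purchases = [
    {"user": "alice", "items": {"book_a", "book_b", "book_c"}},
    {"user": "bob",   "items": {"book_a", "book_b"}},
    {"user": "carol", "items": {"book_b", "book_d"}},
]

# Count how often each pair of items appears in the same purchase history.
co_purchases = defaultdict(Counter)
for record in purchases:
    for item in record["items"]:
        for other in record["items"]:
            if other != item:
                co_purchases[item][other] += 1

def recommend(item, n=3):
    """Return the n items most often bought together with `item`."""
    return [other for other, _ in co_purchases[item].most_common(n)]

print(recommend("book_a"))  # prints ['book_b', 'book_c'] for this toy data

The point of the sketch is simply that the suggestions are derived entirely from observed behaviour, yours and other customers’, without anyone explicitly choosing what to be shown.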

Except that Pariser is talking about the customization of our relationship to the world around us. For example, Zuckerman and Marcia Stepanek report that Pariser said he had serious difficulty adding people with conservative views to his circle of Facebook contacts and then following them: Facebook noted that he was interested in Lady Gaga and progressive ideas, and then personalized his experience by blocking links to people with conservative ideas.

Pariser calls this particular type of echo chamber a “filter bubble”. According to him, the more efficient the filter becomes, the smaller the chance of being exposed to novelty, the unexpected, or the disturbing.

According to Pariser, these bubbles have three characteristics:

1 – The degree of personalization is very high. You no longer belong to the thousands of readers of The Nation: You’re alone in your particular bubble.

2 – This bubble is invisible. Google does not inform you that it is creating a custom bubble for you. You do not know of its existence, let alone its implications.

3 – You do not choose this filtering. Rather, it chooses you. It is not your decision, but someone else’s.

For Pariser, “the filter bubble may be good for consumers but it’s bad for democracy.” “We need to start questioning the values of these filtering devices and get the power back to make these decisions for ourselves.”


