For Eli Pariser, co-founder of the political advocacy organization MoveOn, the algorithms used by sites like Google and Facebook isolate the user in a bubble.
The PageRank algorithm, until then responsible for calculating and ordering the most relevant search results, gained a companion: Google announced a second algorithm whose function was to personalize search. The calculation and display order of results would now take into account the user's browsing history over the previous 180 days.
In other words, search results would be tailored to the user's profile, that is, to their click history. Since then, results have differed from one user to another, even if only subtly. The new algorithm shapes the results according to the personal tastes of whoever is at the keyboard.
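Google's actual system is proprietary, but the mechanism described above can be sketched in miniature: start from a base relevance score, then boost results from domains the user has clicked before. The function names and the 10% boost weight here are invented for illustration.

```python
def personalized_rank(results, click_history):
    """Re-rank search results by boosting domains the user has clicked before.

    results: list of (url, base_score) pairs.
    click_history: dict mapping domain -> number of past clicks.
    """
    def domain(url):
        # crude host extraction: "https://a.com/x" -> "a.com"
        return url.split("/")[2]

    def score(item):
        url, base_score = item
        # each past click on the same domain adds an (arbitrary) 10% boost
        return base_score * (1.0 + 0.1 * click_history.get(domain(url), 0))

    return sorted(results, key=score, reverse=True)
```

Two users issuing the same query get different orderings: a user who often clicks `b.com` sees its pages rise above nominally more relevant results.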
Welcome to the personalized internet. Or rather, to your internet, for it is not the same as your friend's. Like it or not, the web has a filter. Your filter. “We tend to think of the web as a gigantic library, in which services like Google supply us with a universal map. That’s not the case anymore,” says Eli Pariser, co-founder of the political advocacy organization MoveOn and author of The Filter Bubble: What the Internet Is Hiding from You.
Google is not alone in this. Facebook, Apple, Microsoft, Yahoo! and Amazon are also betting on personalization — or customization — of the internet. According to Pariser, the big websites have become predictors of our tastes, our profiles, and, of course, what we would like to read, watch and consume online.
The giants' formula is simple: the more relevant and personal the service, the more ads it sells, and the more products advertisers move.
Historical relic
For Facebook's chief operating officer, Sheryl Sandberg, in a few years a website that is not personalized will be seen as a historical relic. On Facebook, personalization is handled by the EdgeRank algorithm. The New York Times and The Washington Post have built systems that recommend articles to readers according to their profiles, analyzed when they log in.
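Facebook never published EdgeRank's internals, but its publicly described shape is a sum, over every interaction ("edge") attached to a story, of user affinity times edge weight times a time decay. A toy sketch under that assumption, with invented field names and an arbitrary decay rate:

```python
import math

def edgerank(edges, decay_rate=0.1):
    """Score a story as the sum over its edges of
    affinity * weight * time decay (the publicly described EdgeRank shape).

    edges: list of dicts with keys 'affinity', 'weight', 'age_hours'.
    """
    return sum(
        e["affinity"] * e["weight"] * math.exp(-decay_rate * e["age_hours"])
        for e in edges
    )
```

The exponential decay means a fresh comment from a close friend outranks an identical interaction from two days ago, which is why a personalized feed keeps drifting toward recent activity from the people you engage with most.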
At Amazon, according to a study by the consultancy McKinsey, an average of 30% of store sales come from its customer recommendation system. At Netflix, the online DVD rental company in the United States, the figure reaches 60%. Personalization is a powerful business tool, and it has improved the user experience on sites like Google and Facebook.
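The recommendation systems behind those numbers are proprietary, but the core "customers who bought X also bought Y" signal is simple item co-occurrence. A minimal sketch, not Amazon's or Netflix's actual system:

```python
from collections import Counter
from itertools import combinations

def cooccurrence(baskets):
    """Count how often each pair of items appears in the same purchase basket."""
    co = {}
    for basket in baskets:
        for a, b in combinations(sorted(set(basket)), 2):
            co.setdefault(a, Counter())[b] += 1
            co.setdefault(b, Counter())[a] += 1
    return co

def recommend(co, item, k=3):
    """Return the k items most often bought together with `item`."""
    return [other for other, _ in co.get(item, Counter()).most_common(k)]
```

Feeding it a handful of purchase baskets and asking for neighbors of one item already produces the familiar "also bought" list, which is exactly the kind of safe, popularity-driven suggestion Pariser goes on to criticize.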
According to Google engineer Jonathan McPhie, the site's click-through rate increased after the second algorithm was implemented. A Google spokesperson, however, says the word "filter" is inappropriate: the site merely prioritizes results, without filtering or omitting anything. Eli Pariser disagrees. He argues that by personalizing the content they offer, websites are isolating us from one another.
In the old days (and December 2009 is already long past), we were all connected, sharing common online content. The personalized web isolates the user in a bubble. Content is so vast that prioritization is itself a form of filtering. "We shape our tools, and thereafter our tools shape us," says Pariser, echoing the media theorist Marshall McLuhan. The first risk of the filter is sameness. This is what the writer and activist calls the "Chipotle problem," in allusion to the Mexican fast-food chain in the United States.
Everyone enjoys their tacos and burritos. "It's a consistent three-to-four-star experience. But it blows no one away; it's far from five stars," says Pariser. Designed to make safe guesses, recommendation algorithms inevitably serve more and more people Chipotle-style recommendations. They reliably keep us from bad choices, but they also take away the pleasure of the new. The algorithm avoids the extremes, the bad and the exceptional alike, and with them the surprises, which can be good.
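The "safe guess" Pariser describes can be seen in miniature: the single prediction that minimizes expected squared error is the mean of past ratings, and the mean, by construction, never lands on the extremes.

```python
def safe_guess(ratings):
    """The squared-error-minimizing single prediction is the mean:
    it hedges toward the middle, never calling anything one-star or five-star."""
    return sum(ratings) / len(ratings)
```

Given ratings like [1, 5, 5, 3, 4], the prediction is 3.6: a solid Chipotle, never a disaster, never a revelation.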
Personal seasoning
“But the big risk,” says Pariser, “is turning the web into a big ego trip. It has been shown that media shape, to some extent, the identity of those who consume them.” In the “open” world, where information flows freely, the reader is challenged to assimilate new information that questions his beliefs and proposes different things. With personalized content, that no longer happens.
Websites, getting to know us better and better, tend to offer a more palatable version of reality, seasoned to our personal taste. According to Pariser, this creates a strange cognitive loop, like a highway roundabout, in which the content that feeds us is a mirror of ourselves. And no one chooses to enter that bubble.
“In traditional media, if you were more liberal, you chose CNN. If you were more conservative, you watched Fox News. But the decision was yours. With filtered information, you are unaware of the process; the choice is made on your behalf,” says Pariser. The problem is that personalization is here to stay. “The genie never goes back in the bottle,” says Pariser. “But you should have a choice whether or not to enter that bubble.”