Being surrounded by vast expanses of content is so commonplace now that we are no longer amazed by it.
If someone from the 1950s suddenly appeared today, what would be the most difficult thing to explain to them about life today?
“I possess a device, in my pocket, that is capable of accessing the entirety of information known to man. I use it to look at pictures of cats and get into arguments with strangers.” (source)
Craig Detweiler said in a CPX podcast this week that the idea of searching for answers has changed – the expectation is now that we will find whatever we’re looking for, in moments.
In The Filter Bubble, Eli Pariser (who has since become CEO of Upworthy) writes about the unasked questions that surround this access to information. His TED talk “Beware online filter bubbles” gives an overview, but the book is a deeper dive into his thinking.
Google (and most other search engines, DuckDuckGo being a notable exception) made the switch to personalised search, and from that point different people no longer saw the same results for the same query. Google filters search results based on what your past activity suggests your preferences are.
In some ways, this is a positive: the information you receive is more relevant to your interests. In other ways, it is a negative – your own biases can be reinforced, rather than challenged.
This filter bubble doesn’t stop at search. Social media sites choose which updates are most likely to keep you on the site, based on a range of criteria (Facebook, being the largest, is the easiest target, but even Twitter is moving in this direction).
When you’re making purchases online, careful note is taken of your behaviour – which pitches and discounts you respond to – building a “persuasion profile” that helps retailers the next time you make a purchase, even in an entirely different category.
It’s easy for large companies to track behaviour online across a range of websites (via ad networks, “share this” buttons and the like), and so to build ever more detailed profiles.
Pariser argues that this is happening even in the political space, where voters are also profiled. In the US, Google ran a “find your nearby polling place” microsite: was this a way to provide a service, to sell political advertising, or to augment its user profiles with political affiliation and likelihood of voting? The opaque nature of multinational companies makes it hard to tell.
In the final chapter Pariser presents a range of actions that companies and the public can take to break out of the filter bubble and explore a wider set of interests: to have the option of not being tracked, to lobby governments for privacy regulations, and (again in the US) to establish a central government agency to oversee privacy.
Has much changed in the three years since the first edition of this book was published? Certainly. There is more public understanding of the level of tracking that goes on in daily life. But there is also more acceptance of this – a collective shrugging of the shoulders that this is how the world is now, and we’re heading on an inevitable trend line away from personal privacy.
This well-researched deep dive into the consequences of entrusting our decisions to algorithms is still worth reading.