Recommendation engines (series of algorithms that aim to predict what you'd most like to see / read / purchase / you name it) are everywhere. You use them without even knowing it.
The first recommendation engine is actually Google (or any search engine): it aims to recommend the content most relevant to you among millions (if not billions) of possible pages.
Those recommendation algorithms make choices for you and thereby exclude content that might well interest you (have you ever gone further than page 1 of Google's search results?). These algorithms act as filters, and because of the choices they make for us, Eli Pariser coined the term "filter bubble" to describe how machines limit our freedom by trapping us inside a bubble.
In today’s post I’d like to elaborate on this topic and propose new solutions to escape the filter bubble.
Decide on the level of serendipity
Serendipity is defined as
“the faculty of making fortunate discoveries by accident”
In a world where information is steered toward us by algorithms, the likelihood of making fortunate discoveries decreases. Recommendation engines base their suggestions on our past behavior, which pushes us to keep consuming more of the same, sometimes with alarming consequences.
My proposal is therefore to let users decide for themselves how much recommendation they want, rather than having it imposed by a black box. Astonishingly, when I finally read Eli Pariser's book only recently, I found out he had proposed a similar idea:
Google or Facebook could place a slider bar running from “only stuff I like” to “stuff other people like that I’ll probably hate”
This is in fact exactly what I'm dreaming of: a slider that would give control back to users and let them choose how much recommendation they get. The less you get, the higher the likelihood of discovering new content (and speaking of new content and opposing ideas, don't forget to check this article on the expression of political ideas on Facebook).
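To make the idea concrete, here is a minimal sketch of what such a slider could do under the hood. It is entirely hypothetical (neither Google nor Facebook publishes such an interface): a `serendipity` value between 0 and 1 decides how many of the top personalized picks are swapped for items drawn at random from outside the user's comfort zone.

```python
import random

def recommend(user_scores, serendipity, k=5, rng=None):
    """Blend personalized ranking with random exploration.

    user_scores: dict mapping item -> predicted relevance for this user
    serendipity: float in [0, 1]; 0 = "only stuff I like",
                 1 = "stuff other people like that I'll probably hate"
    k: number of items to return
    """
    rng = rng or random.Random()
    # Rank items from most to least relevant for this user.
    ranked = sorted(user_scores, key=user_scores.get, reverse=True)
    # The slider decides how many slots go to chance instead of the model.
    n_random = round(serendipity * k)
    picks = ranked[: k - n_random]
    # Fill the remaining slots from everything the model would have skipped.
    pool = ranked[k - n_random:]
    picks += rng.sample(pool, min(n_random, len(pool)))
    return picks
```

With the slider at 0 the user sees the model's top picks; at 1 every slot is a roll of the dice, maximizing the chance of a fortunate discovery.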
Propose the opposite
An even more radical option would be to propose to the user the opposite of what would usually be recommended. This obviously doesn't apply to consumer goods (who would want to be recommended something they really don't need?) but rather to news consumption, the field where recommendation engines matter most nowadays.
I imagine a screen split in two halves: on the left-hand side, the news that would naturally be recommended to you based on your own behavior or on collaborative filtering techniques; on the right-hand side, content you would normally not be exposed to given your profile.
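One simple way to sketch this split screen, assuming items and user profiles are represented as preference vectors (a common setup in content-based recommendation, though not something any particular news site is known to do): the left pane ranks items by similarity to the profile, the right pane by dissimilarity.

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = sqrt(sum(a * a for a in u))
    nv = sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def split_screen(profile, items, k=3):
    """Return (familiar, opposite): the k items most similar to the
    user's profile vector, and the k items least similar to it.

    items: dict mapping item name -> feature vector
    """
    ranked = sorted(items, key=lambda name: cosine(profile, items[name]),
                    reverse=True)
    familiar = ranked[:k]
    # Least similar first: these feed the right-hand "opposite" pane.
    opposite = list(reversed(ranked[-k:]))
    return familiar, opposite
```

The vectors and the cosine measure are illustrative assumptions; the point is only that "the opposite of your profile" is computable with the same machinery that produces the usual recommendations.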
There is no question that recommendation engines are worthwhile business allies for commercial organizations. Think of all the products Amazon and other online retailers sell thanks to their personalized recommendations.
There are however growing concerns about the negative consequences of recommendation algorithms and Big Data in general. Lack of freedom is one topic I have dealt with already. In this context, promoting serendipity is essential to achieving the libertarian aims of the Internet (which is far from being a libertarian ecosystem nowadays). Serendipity (making fortunate discoveries) and curiosity are two sides of the same coin. If you aren't looking for something, you won't make fortunate discoveries.
It is essential to trigger curiosity; media companies need to awaken in people the kind of good curiosity that the philosopher Hume described as "love of knowledge". Knowledge and culture are the only remedies to the threats currently challenging our democracies.