Interest in the “filter bubble” phenomenon reached an all-time high after the 2016 US presidential elections (see our article).
The filter bubble was hypothesized by Eli Pariser in 2011, yet its actual existence remains at the center of debate. According to Pariser, the algorithms implemented to make our digital lives easier are an impediment to serendipity. According to his opponents, the Internet offers a limitless range of resources that increases the potential for serendipity.
Very few empirical scientific papers have analyzed the existence of the filter bubble. Bakshy, Messing and Adamic (2015) explored filter bubbles in the field of political opinions on Facebook and concluded that Facebook increased exposure to different opinions. Yet their work suffered from the affiliation of the lead author with Facebook Labs. That’s why I was really happy to read another empirical study, conducted by Seth Flaxman (Oxford University), Sharad Goel (Stanford University) and Justin Rao (Microsoft Research).
The research was published in 2016 and conducted on the online reading habits of 50,000 anonymized US users. If you want to read the technical stuff, the paper is available here.
Otherwise, let me jump to the conclusions:
- social media does increase segregation, also called polarization, leading users to read only content that comforts them in their ideology
- the availability of limitless online resources does lead to ideological segregation; the effect is marginal for descriptive news (94% of consumption) and substantial for opinion pieces (6%)
- only 1 in 300 clicks on Facebook leads to a substantive news article; the 299 remaining clicks go to video- or photo-sharing websites
- 78% of users get their news from a single source, and 94% from at most two sources; those sources are mainly mainstream
- the most extremely ideologically oriented users (2%) expose themselves to a wide variety of information sources within the same ideology. This exposure occurs through direct browsing; algorithms play no role.
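To make “ideological segregation” concrete, here is a minimal sketch of one way such a score could be computed: each user gets a polarity from the average slant of the outlets they read, and segregation is the mean distance between users’ polarities. The data, function names, and the specific metric are my own illustrative assumptions, not necessarily the measure used by Flaxman, Goel and Rao.

```python
# Hypothetical sketch of an ideological-segregation score.
# Outlet slants are placed on a [-1, 1] axis (left to right);
# both the toy data and the metric are illustrative assumptions.
from itertools import combinations

def user_polarity(outlet_slants):
    """Average slant of the outlets a user reads (each in [-1, 1])."""
    return sum(outlet_slants) / len(outlet_slants)

def segregation(polarities):
    """Mean absolute distance between all pairs of user polarities."""
    pairs = list(combinations(polarities, 2))
    return sum(abs(a - b) for a, b in pairs) / len(pairs)

# Toy data: three users with different reading habits.
users = [
    user_polarity([-0.8, -0.6]),   # mostly left-leaning outlets
    user_polarity([0.7, 0.9]),     # mostly right-leaning outlets
    user_polarity([-0.1, 0.2]),    # mainstream mix
]
print(round(segregation(users), 2))  # prints 1.0
```

Under this toy metric, a population reading only mainstream outlets would score near zero, while the 2% of highly partisan users described above would push the score up, which is the intuition behind the study’s findings.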
The authors conclude that “though the predicted filter bubble and echo chamber mechanisms do appear to increase online segregation, their overall effects at this time are somewhat limited”.
For the vast majority of users (94%), information sources are limited in number (two at most) and mainstream. Users with more extreme opinions directly browse a broad range of sources that reinforce their beliefs.
This research therefore shows that we are living in echo chambers that we create ourselves; algorithms only marginally contribute to reinforcing those bubbles.