The filter bubble theory is once again being questioned. This time, American research shows that it is the composition of our network that determines whether we become trapped in a filter bubble. In particular, “weak links” are crucial for exposure to diverse content. This article explains why.
If you only have 30 seconds
- An American study (another one) questions the filter bubble theory. It was conducted on Facebook users.
- It shows that exposure to dissenting opinions is much more common (87.1% of respondents) than the filter bubble theory would suggest.
- “Weak links” are essential to exposure to opinions different from our own. Weak links are the people in our networks with whom we have only distant relationships.
- One factor, in particular, is crucial: the diversity of the network in terms of ethnicity and religion.
The filter bubble is the theory, popularised by Eli Pariser in 2011, that algorithms recommend only what we are used to consuming and, in doing so, lock us in. The problem is said to be particularly prevalent on the sites we visit to inform ourselves, first and foremost social networks. Our initial beliefs are reinforced by algorithms that only expose us to what we already like, and we lock ourselves ever more deeply into those beliefs.
So much for the theory. Since 2015, however, scientific evidence undermining it has been accumulating. A new study, published in the Journal of Social Media in Society, reinforces the doubts already raised about the very existence of algorithmic filter bubbles. It also explains how our network helps counterbalance the potentially adverse effects of algorithms.
Algorithms have no control over the composition of your network. Yet it is this network that determines your level of exposure to diverse information.
The majority of respondents (87.1%) are confronted with views that are different from their own. A minority is rarely (11.1%) or never (1.8%) confronted with information that differs from their beliefs.
Exposure to these different points of view is mainly the result of “weak” connections, that is to say, people in the network with whom the respondents only have distant relationships. “Strong” connections (family, close friends) contribute significantly less to exposure to topics that diverge from respondents’ opinions.
One of the explanations put forward is the multiplication of individuals’ online identities. The concept of “ideological silos” (cf. Sunstein 2003, 2017) only makes sense if an individual has a single identity. Today, however, our online presence leads us to have several adjacent identities, which sometimes overlap. My identity on LinkedIn is not the same as on Facebook, nor on a forum dedicated to one of my hobbies. These different places of expression, centred around my needs and the resulting identities, also involve various networks. These networks often overlap, and “friendships” are often shared across several networks over time.
The research is based on an online survey of Facebook users (N=271). The method differs from that of Bakshy et al. (2015), also conducted on Facebook but on a much broader cross-section.
Also, the data are declarative, where Bakshy et al. used observed data.
Nevertheless, the study has the merit of distinguishing between strong and weak links. The use of perceptions reported by respondents is also common: several other cited studies use this methodological approach.
One interesting methodological point is worth noting. The authors use the Herfindahl-Hirschman index to measure diversity in the respondents’ networks. The results indicate that the factors “ethnicity” and “religion” are highly correlated with the variety of opinions with which respondents are confronted. In other words, having people of different ethnicity or religion in one’s network increases the likelihood of being exposed to views different from one’s own.
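To make the index concrete: the Herfindahl-Hirschman index is the sum of the squared shares of each category in a population, so a fully homogeneous network scores 1 and a more mixed one scores lower. The sketch below shows the standard formula applied to a hypothetical list of contacts; the exact operationalisation used by Min and Wohn (category definitions, weighting) is not detailed here, so the example data and the `1 - HHI` diversity score are illustrative assumptions, not the paper’s implementation.

```python
from collections import Counter

def herfindahl_hirschman(categories):
    """Standard HHI: sum of squared shares of each category.
    Equals 1.0 for a fully homogeneous group; falls toward 0
    as the group becomes more evenly mixed."""
    counts = Counter(categories)
    total = sum(counts.values())
    return sum((n / total) ** 2 for n in counts.values())

def diversity(categories):
    """A common diversity score derived from the HHI (1 - HHI)."""
    return 1 - herfindahl_hirschman(categories)

# Hypothetical data: the religion of each contact in two respondents' networks.
homogeneous = ["A"] * 10
mixed = ["A"] * 5 + ["B"] * 3 + ["C"] * 2

print(diversity(homogeneous))        # 0.0 — everyone shares one category
print(round(diversity(mixed), 2))    # 0.62 — shares 0.5, 0.3, 0.2
```

A higher diversity score here would, per the study’s finding, go hand in hand with more frequent exposure to cross-cutting opinions.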
Bakshy, E., Messing, S., & Adamic, L. A. (2015). Exposure to ideologically diverse news and opinion on Facebook. Science, 348(6239), 1130-1132.
Min, S. J., & Wohn, D. Y. (2020). Underneath the Filter Bubble: The Role of Weak Ties and Network Cultural Diversity in Cross-Cutting Exposure to Disagreements on Social Media. The Journal of Social Media in Society, 9(1), 22-38.