Technology, and especially algorithms, has invaded the public space. According to Alain Damasio, "Freedom is not a desirable concept anymore". Security and ease of use now come first, as the success of recommendation engines tends to prove.
We are surrounded by computer code suggesting choices and making decisions for us: from Google's search box making suggestions to Netflix's recommendation engine, by way of Facebook's EdgeRank algorithm.
These algorithms certainly help us. The number of possibilities surrounding us is tremendous, and without algorithms our online behavior might well be less efficient. Yet these algorithms also have a perverse effect: they nudge us to change our own behaviors, to discipline ourselves into what the algorithm expects of us.
Let's take two examples: Facebook's and Netflix's.
Facebook’s users change their behaviors to become more visible
Consider first Facebook's algorithm, EdgeRank. Although its real mechanics remain hidden in a well-sealed black box, we can reasonably argue that a post's popularity depends on its number of likes, comments, and shares. The popularity of the author also plays a role.
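Since the real formula is a black box, the factors just listed can only be illustrated with a toy model. The sketch below is purely hypothetical: the weights, the multiplicative form, and the function name are invented for demonstration and do not describe Facebook's actual algorithm.

```python
# Illustrative sketch only: EdgeRank's real formula is proprietary.
# All weights below are invented for demonstration purposes.
def post_score(likes, comments, shares, author_popularity,
               w_like=1.0, w_comment=2.0, w_share=3.0, w_author=0.5):
    """Toy visibility score: engagement weighted by type,
    then scaled up by the author's own popularity."""
    engagement = w_like * likes + w_comment * comments + w_share * shares
    return engagement * (1.0 + w_author * author_popularity)

# A commented-on and shared post by a popular author outranks a post
# that collected more raw likes from an unknown author.
print(post_score(likes=100, comments=10, shares=5, author_popularity=2.0))
print(post_score(likes=150, comments=0, shares=0, author_popularity=0.0))
```

Under any such scheme, users who learn which interaction types carry the most weight can adapt their posting behavior accordingly, which is exactly the disciplining effect Bucher describes.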
As Taina Bucher shows remarkably well in her 2012 article, EdgeRank rewards users with visibility. She writes:
In Facebook there is not so much a ‘threat of visibility’ as there is a ‘threat of invisibility’ that seems to govern the actions of its subjects. The problem as it appears is not the possibility of constantly being observed, but the possibility of constantly disappearing, of not being considered important enough. In order to appear, to become visible, one needs to follow a certain platform logic embedded in the architecture of Facebook.
The algorithm trains its users (who implicitly notice its effects, as studies by Rader and Gray (2015) and Eslami et al. (2015) showed). As in a gigantic Pavlovian experiment, Facebook's users expect their reward in exchange for algorithm-conforming behavior. As Bucher puts it:
EdgeRank, by functioning as a disciplinary technique, creates subjects that endlessly modify their behaviour to approximate the normal.
Netflix creates movies that will get the best ratings
Netflix's recommendation engine uses ratings to predict which movies a customer is likely to appreciate. Netflix has gained considerable knowledge of what people want to see, and has been able to produce a segmentation that breaks completely with the old-style socio-demographic rules used until now. Netflix has algorithmic knowledge of the subtleties that make a movie appeal to the widest audience, and can therefore settle on a recipe that maximizes its chances of success as far as ratings are concerned. Hallinan and Striphas (2016) explain:
Data collection and interpretation permeate many aspects of corporate decision making, from the vetting of potential acquisitions to the shaping of the context of acquired properties. Netflix is not simply hiring auteurs whose unique vision for a production prevails over all else. In the case of Orange Is the New Black, Sarandos reports that
Netflix has exerted “a lot of casting influence” over the property based on what its algorithms suggested would be the most effective choices for actors in terms of attracting audiences and new subscribers (quoted in Rose, 2013). The company has taken an equally radical step in choosing to release an entire season of its shows all at once, rather than doling out one new episode per week at a regularly scheduled time. The shift away from “appointment viewing,” long prevalent in traditional television, to “binge viewing” grew out of Netflix’s analysis of viewing data, which showed its streaming customers tended to watch several TV episodes back to back instead of one at a time. The insight has affected both the structure and content of these shows, allowing scriptwriters to sidestep recaps, cliff-hangers, and similar narrative devices intended to keep viewers glued between commercial breaks and from one week to the next (Rose, 2013). While Netflix has not extensively described the relationship between its viewing data and production decisions, Sarandos has affirmed that the company’s goal is “to optimize for the best shows,” informed by “data-driven hunches” (quoted in Karpel, 2012).
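To make the idea of rating-based prediction concrete, here is a minimal sketch of user-based collaborative filtering, the family of techniques recommendation engines of this kind draw on. Everything here is invented for illustration — the user names, the ratings, and the choice of cosine similarity; Netflix's actual engine is far more sophisticated and proprietary.

```python
import math

# Toy ratings matrix: user -> {movie: rating}. All data is fictional.
ratings = {
    "ann":  {"drama_a": 5, "comedy_b": 1, "drama_c": 4},
    "ben":  {"drama_a": 4, "comedy_b": 2, "drama_c": 5},
    "cleo": {"drama_a": 1, "comedy_b": 5},
}

def similarity(u, v):
    """Cosine similarity over the movies both users rated."""
    common = set(ratings[u]) & set(ratings[v])
    if not common:
        return 0.0
    dot = sum(ratings[u][m] * ratings[v][m] for m in common)
    norm_u = math.sqrt(sum(ratings[u][m] ** 2 for m in common))
    norm_v = math.sqrt(sum(ratings[v][m] ** 2 for m in common))
    return dot / (norm_u * norm_v)

def predict(user, movie):
    """Similarity-weighted average of other users' ratings for the movie."""
    pairs = [(similarity(user, other), r[movie])
             for other, r in ratings.items()
             if other != user and movie in r]
    total = sum(s for s, _ in pairs)
    return sum(s * x for s, x in pairs) / total if total else None

# cleo has not rated drama_c; predict her rating from similar users.
print(predict("cleo", "drama_c"))
```

The same mechanism, run in reverse, is what the article describes: once a company can predict which attributes drive ratings, it can commission content engineered to score well on them.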
Conclusion: algorithms change the rules of the game
The Facebook and Netflix cases show how behaviors change to meet algorithms' expectations. In the first case, users adapt their behavior; in the second, producers adapt their content to maximize the algorithm's response.
The danger of algorithmic regulation (beyond the loss of freedom already discussed in this article) is the normalization of behaviors. "Commenting and participation become processes through which the subject may approximate [a] desired normality" (Bucher, 2012).
What is a society in which all members tend toward normality? It may well be a mediocre one.
Image: Shutterstock