Curation algorithms aren’t fundamentally different from recommendation engines (see the article we just published on the latter here). They select what you get to see, for instance on your Facebook News Feed.
Two studies have addressed users’ perception of automated curation: Rader and Gray (2015) and Eslami et al. (2015). Here is how users perceive Facebook’s News Feed algorithm.
Are people aware of the existence of the News Feed curation algorithm?
The first striking fact is that most people aren’t aware of the effect of Facebook’s algorithm.
Eslami et al. (2015) found that 62.5% of users were not aware that stories were hidden by the algorithm. However, Rader and Gray (2015) showed that 73% of Facebook users said they felt that some stories were hidden.
There seems to be quite a bit of mystery behind the curation process. In the latter study, only 20% of users speculated that an algorithm might automatically curate content. The other 80% offered no explanation, which reveals the lack of “education” on this rather technical subject.
Usage determines awareness
Interestingly, regular users of Facebook show more awareness of the curation process. Frequent users understood that some kind of ranking prioritized posts on a person’s News Feed. This intuition was gained through regular use and the observation of posts being promoted or not.
Eslami et al. (2015) sum this finding up:
most Aware participants stated they gained knowledge about the algorithm via one or two of the following common paths: inductively comparing feeds or deductively considering network size.
Do users want another curation algorithm?
Interestingly, when people were made aware of the existence of an algorithm (Eslami et al. developed a plugin called FeedVis to do just that), 83% reported a behavioral change. In other words, FeedVis nudged users’ behavior (a good thing in this case), prompting them to observe what was going on. Even more interestingly, when users were given the choice to compose their own News Feed, they moved on average 43% of their friends from one category (visible or invisible) to the other.
There’s little doubt, based on these qualitative studies, that more education is needed on the role of algorithms in our society. We might also need to give users more freedom to fine-tune the content proposed by the different websites they visit regularly.
If you want to learn more, the Harvard University website offers a broadcast presentation of one of the papers. Nathan Mathias also offers some valuable insights in an article on MIT’s website.
Image: Shutterstock