6 September 2017

Workshop on fairness and ethics in recommendation systems

By Pierre-Nicolas Schwab, PhD in marketing, director of IntoTheMinds

On the last day of the RecSys 2017 conference I was fortunate enough to be on the organizing committee of the workshop on fairness, accountability and transparency in recommendation systems (F.A.T.REC). This workshop was the first of its kind within the RecSys community; the conference is usually very tech-oriented, and Joe Konstan, who chaired the very first ACM RecSys conference, indicated in a plenary session that he wanted more space to be dedicated to ethics in future conferences. Michael Ekstrand, who will chair the RecSys 2018 conference in Vancouver and whom I invited to Brussels for an introductory talk on ethical recommender systems at Digityser, will certainly take Joe's wish to heart.

Based on the various discussions we had that day, it became clear to me that fairness in algorithmic recommendation systems needs to be better defined. We often heard very relevant arguments, yet their relevance varied greatly with the context.

What does (un)fairness mean?

Unfairness should not be confused with discrimination, as Krishna Gummadi (Max Planck Institute for Software Systems) reminded us: discrimination is a special type of unfairness.

Establishing a common understanding of the very concept of fairness as applied to recommender systems is therefore more than necessary.
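To see how much the choice of definition matters, consider one narrow formalization often discussed in the fairness literature: statistical parity of exposure, i.e. whether items from different groups (say, different news outlets) receive comparable shares of recommendation slots. Below is a minimal sketch; the data structures and names are hypothetical, not taken from any particular system.

```python
from collections import defaultdict

def exposure_by_group(recommendations, item_group):
    """Share of recommendation slots given to each group of items.

    recommendations: list of recommended item ids (hypothetical format)
    item_group: dict mapping item id -> group label (e.g. news outlet)
    """
    counts = defaultdict(int)
    for item in recommendations:
        counts[item_group[item]] += 1
    total = len(recommendations)
    return {group: n / total for group, n in counts.items()}

def parity_gap(exposure):
    """Largest difference in exposure share between any two groups.

    0.0 means equal exposure under this narrow definition; a large gap
    is evidence worth investigating, not proof of unfairness in itself.
    """
    shares = list(exposure.values())
    return max(shares) - min(shares)

# Toy example: one outlet gets three of the four recommendation slots.
recs = ["a1", "a2", "a3", "b1"]
outlet = {"a1": "outlet_A", "a2": "outlet_A",
          "a3": "outlet_A", "b1": "outlet_B"}
print(parity_gap(exposure_by_group(recs, outlet)))  # 0.5
```

Whether a gap of 0.5 is unfair depends entirely on the context: it may be perfectly acceptable for music playlists and problematic for political news, which is exactly why a shared definition is needed.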

What is the risk of being unfair?

The second question to ask yourself is how big the risk of unfair recommendations really is. There is a big difference between a music recommendation service and a news recommendation service.

What are the consequences of biased recommendations in a subscription-based service like Spotify? You get recommended a track you may not like and will skip; the consequences for the consumer are small.
They may however be much bigger in a news recommendation context, where the consumer may be exposed to a polarized perspective.

I tried to summarize this in the figure below:

|                                 | High-involvement          | Low-involvement           |
|---------------------------------|---------------------------|---------------------------|
| Low control, low transparency   | High risk of unfairness   | Medium risk of unfairness |
| High control, high transparency | Medium risk of unfairness | Low risk of unfairness    |
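Read as a lookup, the matrix boils down to a few lines. This is only a sketch of my own figure; the labels are mine, not an established taxonomy.

```python
# Hypothetical encoding of the matrix above: risk of unfair outcomes as
# a function of user involvement and of the control/transparency the
# system grants.
RISK_OF_UNFAIRNESS = {
    ("high involvement", "low control & transparency"): "high",
    ("low involvement", "low control & transparency"): "medium",
    ("high involvement", "high control & transparency"): "medium",
    ("low involvement", "high control & transparency"): "low",
}

# News read in an opaque, uncontrollable feed: the worst quadrant.
print(RISK_OF_UNFAIRNESS[("high involvement", "low control & transparency")])
```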

The highest risk I see is in contexts where the consumer's involvement is high, i.e. contexts with conscious decision-making. News consumption, first-time or highly involving purchase decisions, and choosing a partner on an online dating site are examples of such contexts. Imagine you have to buy a new car and rely on a specialized website to compare features and make a choice: biased recommendations would have serious consequences. If the user is not aware of the presence of an algorithmic recommendation mechanism and has no control over it, the risk is high.
On the contrary, getting a movie recommended on Netflix, or an adversary in an online game, carries a low risk of harming the user even if he/she has no control over the algorithm. John Kolen of EA explained at RecSys 2017 how adversaries are selected by the machine to ensure maximum player enjoyment. And believe it or not, the algorithm will not always choose a weak adversary for you to fight against; in certain cases it will actually choose a player it is certain you will lose against.
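To make that idea concrete, here is a toy sketch of engagement-optimized matchmaking. It is not EA's actual system: the enjoyment model and all the numbers are invented for illustration.

```python
def predicted_enjoyment(player, opponent):
    """Stand-in for a learned model of how much `player` will enjoy
    (and keep playing after) a match against `opponent`.

    Hypothetical assumption: enjoyment peaks when skills are close, so
    the best match is rarely the easiest win.
    """
    gap = (opponent["skill"] - player["skill"]) / 100.0
    return 1.0 / (1.0 + gap ** 2)

def pick_opponent(player, candidates):
    # Maximize predicted enjoyment, not the player's win probability:
    # sometimes the chosen opponent is one the player will lose to.
    return max(candidates, key=lambda c: predicted_enjoyment(player, c))

you = {"name": "you", "skill": 1200}
pool = [{"name": "pushover", "skill": 900},
        {"name": "peer", "skill": 1220},
        {"name": "much stronger", "skill": 1500}]
print(pick_opponent(you, pool)["name"])  # "peer", a slightly stronger player
```

The design choice worth noticing is the objective function: swap `predicted_enjoyment` for "probability of winning" and you get a very different, and arguably less engaging, matchmaking policy.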

Is the user aware of the recommendation's bias?

The last thing to consider is whether the user is aware of the presence of an algorithmic system at all. As one study pointed out, most Facebook users are unaware of the algorithmic curation of their news feed. More transparency is therefore badly needed.


