17 October 2016

Big Data and Ethics: how recommendations work at Meetup.com

By Pierre-Nicolas Schwab, PhD in marketing, director of IntoTheMinds

There have been many interesting talks at the RecSys 2016 conference in Boston. Yet, the presentation given by Evan Estola of Meetup was especially inspiring to me.

It wasn’t the usual technical talk you might expect at that kind of conference (although Evan is himself a rather technical guy); rather, it gave the audience a perspective and a vision on how ethics and Big Data can (and need to) be combined.

Evan started by giving examples of recommendation systems that went bad: from the Orbitz controversy over the differential treatment of PC and Mac users, to racist search results, to odd recommendations on Amazon. The many examples showed that recommendations can hurt, and that managers need to take this into account if they don’t want to handle tons of complaints.

Slide deck: “When Recommendation Systems Go Bad”, RecSys, by Evan Estola

Evan then went on to explain how ethical considerations are built into recommendations at Meetup, and that’s the really inspiring part. Taking the example of tech groups in New York, Evan argued that gender should NOT be taken into account in the recommendation model. Otherwise the model would be biased towards recommending such groups more to men than to women. And that’s where Meetup’s ethical inclination kicks in.

“You need to decide which feature not to use in your algorithm”

Evan Estola, RecSys 2016

The conscious choice of not including a variable, and thereby deliberately biasing the results of the recommendation, is by nature an ethical choice. Biases are not necessarily bad; I think that when they stem from a deliberate, positive choice they are actually more than necessary. A machine doesn’t understand ethics. A machine has no feelings. A machine doesn’t understand cultural differences.
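To make the idea concrete, here is a minimal sketch of what such a deliberate exclusion can look like in practice. This is my own illustration, not Meetup’s actual pipeline; the feature names and the `select_features` helper are invented for the example.

```python
# Sketch: deliberately exclude a sensitive feature before it ever
# reaches a recommendation model. All names here are hypothetical.

def select_features(member_profile: dict, excluded: set) -> dict:
    """Return only the features the model is allowed to see."""
    return {k: v for k, v in member_profile.items() if k not in excluded}

# A made-up member profile as the raw data might look.
profile = {"topics": ["python", "data"], "city": "New York", "gender": "F"}

# The ethical choice is encoded explicitly and visibly in the code:
# gender is never passed to the model.
EXCLUDED_FEATURES = {"gender"}
model_input = select_features(profile, EXCLUDED_FEATURES)

print(model_input)  # gender is absent from the model's input
```

The point is that the exclusion is an explicit, reviewable decision in the code, not an accident of the data pipeline.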

We must get away from the trap of optimizing recommendation engines purely for financial gain. We must rethink the place of the customer and revisit the concept of “value”. Better recommendations do not necessarily mean more value for the customer. Does watching more recommended movies on Netflix create more value for the individual? I doubt it.

I leave you with this quote I took away from Evan’s speech:

“the most optimal algorithm is perhaps not the best one to launch into production”

Image: Shutterstock



