4 July 2018

Societal and ethical issues of digitization: are emotions covered by GDPR?

By Pierre-Nicolas Schwab PhD in marketing, director of IntoTheMinds

An academic paper by Royakkers et al. (2018), currently available in the journal Ethics and Information Technology, offers an interesting overview of the societal and ethical issues of digitization (hence its title).
Digitization issues are analyzed along six axes:

  • privacy
  • autonomy
  • safety and security
  • balance of power
  • human dignity
  • justice

It’s not my intention here to sum up the whole paper; I’d rather focus on one aspect at a time. In today’s article I’ll therefore discuss the aspects related to the automatic analysis of emotions.

Analysis of sensitive information

I have already dealt with the analysis of emotions in an article I published after meeting Andrew McStay at the CPDP conference in 2017, and in another one after a presentation at the same conference.
Royakkers and colleagues rightly stress that biometric information can be used to enhance safety. In applications like identification and recognition, biometric information is instrumental in granting access and authorization. While such applications were once reserved for “official” uses (by customs, for instance), cheap processing power and large databases of visual material now make it possible to experiment beyond those official frontiers. Sentiment analysis, for instance, has become quite widespread. I experimented with it a few years ago on tweets and public Facebook messages (with little success, however), and since then it has become a mainstream technique applied to photo material. Applied in real time and embedded in cameras, it makes it possible to detect the emotions of individuals whose image is captured on camera (see the example of intelligent shelves by AMS).
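To give a concrete idea of what text-based sentiment analysis looks like, here is a minimal sketch in Python using NLTK’s VADER analyzer. This is not the exact setup I used on tweets back then, and the sample messages are invented for illustration.

    # Minimal sentiment-analysis sketch with NLTK's VADER analyzer.
    import nltk
    from nltk.sentiment.vader import SentimentIntensityAnalyzer

    nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
    analyzer = SentimentIntensityAnalyzer()

    messages = [
        "I love this new store layout!",       # invented sample message
        "Worst customer service I ever had.",  # invented sample message
    ]

    for text in messages:
        scores = analyzer.polarity_scores(text)  # neg / neu / pos / compound
        print(f"{scores['compound']:+.2f}  {text}")

The compound score ranges from -1 (very negative) to +1 (very positive); applying the same idea to images requires a trained vision model rather than a lexicon.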
The authors write :

“the next generation of biometrics not only gives insights into “who you are” but also focuses on the question “how you feel” […] This is an invasion of a new field of privacy, namely “mental privacy”

Are emotions personal data?

This raises the question of whether emotions are personal data by nature. I’d frame my answer to that question by looking at the GDPR, and my answer is therefore “it depends”.
It depends first of all on whether or not the data is saved. In some cases the data doesn’t need to be saved at all. When M&C Saatchi, Clear Channel and Posterscope launched “intelligent” posters, there was no need to save any data. The advertising message was personalized based on the emotion of the person in front of the poster: a set of messages was preloaded, and depending on the emotion detected the matching message was displayed. I don’t see any need to record anything at the individual level. A system conceived with privacy by design in mind would deliver only aggregated data (x% of joyful people on day 1, y% of sad people on day 2, etc.).
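As a sketch of what such a privacy-by-design pipeline could look like, the snippet below only increments daily counters per detected emotion and never stores frames or individual-level records. The detect_emotion function is a hypothetical stand-in for whatever on-device classifier the camera runs.

    # Privacy-by-design sketch: keep only aggregated daily emotion counts.
    from collections import Counter
    from datetime import date

    daily_counts: dict[date, Counter] = {}

    def detect_emotion(frame) -> str:
        """Hypothetical stand-in for the camera's on-device classifier."""
        return "joy"  # placeholder result for illustration

    def record_emotion(frame) -> None:
        emotion = detect_emotion(frame)  # e.g. "joy", "sadness", "neutral"
        daily_counts.setdefault(date.today(), Counter())[emotion] += 1
        # The frame itself is discarded: only the aggregated tally survives.

    def daily_report(day: date) -> dict:
        counts = daily_counts.get(day, Counter())
        total = sum(counts.values()) or 1
        return {e: round(100 * n / total, 1) for e, n in counts.items()}

Because nothing in daily_counts can be traced back to an individual, the output (“x% of joyful people on day 1”) stays at the aggregated level the previous paragraph describes.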
If data is recorded at a more granular level, it becomes tricky. The first question I’d ask is why it is needed. There may be a good reason (for instance direct marketing after the emotion is measured). But in any case the very nature of the data (emotions are fleeting) makes it compulsory to give it a very short lifecycle and to erase it within 24 hours. Why would you keep an emotion on record if that information becomes obsolete after 24 hours?
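Here is a minimal sketch of what such a short lifecycle could look like, assuming individual-level records are kept in a simple in-memory list (a real system would rather rely on a database TTL or a scheduled deletion job):

    # Sketch of a 24-hour retention rule for individual-level emotion records.
    from dataclasses import dataclass, field
    from datetime import datetime, timedelta, timezone

    RETENTION = timedelta(hours=24)

    def _now() -> datetime:
        return datetime.now(timezone.utc)

    @dataclass
    class EmotionRecord:
        person_id: str  # hypothetical identifier linking the emotion to a person
        emotion: str
        captured_at: datetime = field(default_factory=_now)

    records: list[EmotionRecord] = []

    def purge_expired() -> None:
        """Erase every record older than 24 hours, per the short-lifecycle rule."""
        cutoff = _now() - RETENTION
        records[:] = [r for r in records if r.captured_at >= cutoff]

Running purge_expired() on a schedule (every hour, say) guarantees that no emotion stays on record longer than the 24 hours discussed above.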

Emotions become personal data if kept on record (but the GDPR may forbid you to do so)

From the above it follows naturally that emotions are personal data if they can be associated with an individual. It also follows from the GDPR that emotion-related data should have a very short lifecycle and be erased quickly (unless you can justify a legitimate purpose for keeping it, but honestly I don’t see one).
Outside Europe, however, citizens will be less protected, or not protected at all, against bad practices. Take for instance the above case of intelligent retail shelves. Because supermarket consumers tend to buy the same items in the same outlets, it is highly likely that the same faces will show up every single week. A potential bad practice would therefore be to record emotions on a weekly basis and to track an individual’s emotional evolution without any legitimate purpose.

Conclusion

Someone’s emotional state is very personal by nature. Advances in AI have made it possible to analyze emotions in real time, which opens up new possibilities for many sectors (advertising, for instance).
If such information is recorded and saved, the GDPR legal framework would require that its lifecycle be very short, given the fleeting nature of the data. However, the GDPR applies only in Europe and to European citizens, which leaves the door open in the US for questionable practices to come.

Image: Shutterstock



Posted in big data.
