During a recent programme on the French television channel BFMTV, Paul Hermelin, the head of Cap Gemini, described the technologies currently transforming his group and mentioned a project that surprised me. With the help of Cap Gemini, a cruise-ship operator has allegedly set up a facial recognition system that measures the emotions of customers on board in order to gauge their satisfaction. In Paul Hermelin’s words during the programme:
“[Cap Gemini] is creating a facial recognition program for a Caribbean cruise company that examines the faces of its customers to see if they are happy or not and then adapts its programs based on the satisfaction measured on their faces”.
The mention of this project (of which I have found no trace on the Internet) surprised me because it brings together subjects I have covered in previous publications: the personal nature of emotions, facial recognition in supermarkets for dynamic pricing purposes, and the interpretation of customer satisfaction on the basis of biological data.
Determining emotions through facial recognition
The method is not new and has been implemented in many environments. However, it is a matter of debate.
Facial recognition makes it possible to identify individuals (you may have experienced this at airport checkpoints), at the risk of totalitarian abuses (the case of China, which wants to register and score its citizens). While people may sometimes submit to this technology in full knowledge of the facts (e.g. border controls), its use can also be more insidious. We have discussed the case of intelligent supermarket shelves, which raises the ethical question of whether prices should vary according to the consumer standing in front of them (keep in mind, however, that the concept of a single price is itself relatively recent, as we mentioned in the article on Frank Woolworth), as well as legal questions when content is personalised without a person’s knowledge (the now famous example of advertising that adapts to your emotions).
Whatever the use case, it is essential to inform the people who may be subject to such automated analyses and to question the real contribution of the technology to the end-user. Let me repeat once again that technology must be used wisely and must first and foremost help increase customer satisfaction. So let’s come to the case at hand today: the transition from facial recognition to customer satisfaction.
Facial recognition for customer satisfaction
In his interview, Paul Hermelin mentioned a practice that I had never heard of before. Facial recognition would be used to detect the emotions of cruise ship passengers and thus interpret their level of satisfaction. He even mentioned the possibility of changing menus and adapting activities in case of a detected drop in customer satisfaction.
Measuring facial emotions
Let’s take a look at the first part of the system: going from an emotion measured on a face to an evaluation of satisfaction. First of all, note that our entire body reflects our emotions. The body is governed by biological algorithms that determine how it reacts, especially through facial expressions. There is, therefore, nothing incongruous about wanting to measure emotions on a face, and many specialised algorithms exist for this purpose. To convince yourself of this, I invite you to read this article or watch the video below. Mentalists are masters at reading and interpreting bodily reactions, so I do not doubt a computer’s ability to do the same job. But how can we move from an emotion to a measure of satisfaction?
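To caricature what such an algorithm does, here is a minimal sketch in Python. It assumes a hypothetical upstream component that outputs facial "action unit" intensities (the AU numbering borrows from the Facial Action Coding System, but the thresholds and the mapping to emotions are entirely invented for illustration):

```python
# Hypothetical sketch: mapping facial action-unit intensities (0.0-1.0)
# to a coarse emotion label. Real emotion-recognition systems are far
# more sophisticated; thresholds and mappings here are invented.

def classify_emotion(action_units: dict) -> str:
    """Return a coarse emotion label from facial action-unit intensities."""
    smile = action_units.get("lip_corner_puller", 0.0)       # AU12 in FACS
    frown = action_units.get("brow_lowerer", 0.0)            # AU4
    brow_raise = action_units.get("inner_brow_raiser", 0.0)  # AU1

    # Only pronounced movements are reliably detectable; subtle
    # micromovements fall below any realistic threshold.
    if smile > 0.6:
        return "happy"
    if frown > 0.6:
        return "displeased"
    if brow_raise > 0.6:
        return "surprised"
    return "neutral"  # most frames: no clearly readable emotion

print(classify_emotion({"lip_corner_puller": 0.8}))  # broad smile -> "happy"
print(classify_emotion({"lip_corner_puller": 0.2}))  # subtle movement -> "neutral"
```

Note how everything below the thresholds collapses into "neutral": this is precisely where the difficulty discussed in the next section begins.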
Transforming the assessment of emotion into a measurement of customer satisfaction
Everything gets complicated when you have to translate an emotion into a level of customer satisfaction. I have serious doubts about the very feasibility of this idea.
Emotions are by nature fleeting events. They can, of course, be interpreted easily when they produce very recognisable facial movements, such as those captured in the work of the Austrian sculptor Franz Xaver Messerschmidt. But in most cases, emotions show on our faces as movements that are difficult to perceive: a jaw that tightens, a muscle that twitches, an eyebrow that rises. Even if an algorithm existed that could automate the interpretation of all these facial micromovements, its use outside a laboratory seems to me compromised for at least two reasons: resolution and computation time.
A sculpture by Franz Xaver Messerschmidt (credits: Flickr / Jerzy Kociatkiewicz)
Detecting micromovements requires images of particularly good quality, and therefore high resolution. I will let you imagine what this would mean in an environment such as a ship with hundreds or even thousands of people on board. Analysing all these faces and their movements would then require significant computing power. But that’s not all. I still have two more compelling arguments for saying that Paul Hermelin’s comments are more a matter of foresight than of reality.
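A back-of-envelope calculation gives a sense of the scale. All figures below are my own assumptions (passenger count, one camera view per person, 4K frames, 30 frames per second), not anything Cap Gemini has published:

```python
# Back-of-envelope estimate: raw pixel throughput a shipwide
# emotion-analysis system would have to ingest. All figures are
# assumptions chosen for illustration.

passengers = 3000            # a large cruise ship
views_per_passenger = 1      # optimistic: one usable camera view per person
resolution = 3840 * 2160     # 4K frames, needed to resolve micromovements
fps = 30                     # frames per second per view

pixels_per_second = passengers * views_per_passenger * resolution * fps
print(f"{pixels_per_second / 1e12:.1f} trillion pixels per second")
```

Under these assumptions the system would have to ingest roughly 0.7 trillion pixels per second, before any analysis has even started.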
First of all, our faces are continuously animated by micromovements that do not necessarily reflect a state of satisfaction or dissatisfaction. At the scale of a computer system, this is called noise. Too much noise calls into question the very feasibility of the system. The second, and not the least, argument: since emotions are fleeting, how could a system, however sophisticated, capture the one moment among so many others in which the feeling expressed genuinely reflects satisfaction or dissatisfaction? The whole correlation between satisfaction and feeling hinges on this.
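The noise argument can be put into numbers with a toy model. The parameters are again invented (how often a genuine satisfaction-related expression occurs, and how long it lasts), but the order of magnitude is what matters:

```python
# Toy model of the noise argument: if genuine satisfaction-related
# expressions are rare and brief, what fraction of captured frames
# actually carries signal? All parameters are invented.

expressions_per_hour = 6     # assumed genuine emotional episodes per hour
expression_duration_s = 0.5  # micro-expressions last well under a second

signal_seconds = expressions_per_hour * expression_duration_s
fraction_signal = signal_seconds / 3600  # share of one hour carrying signal

print(f"{fraction_signal:.4%} of captured time carries genuine signal")
```

With these assumptions, less than 0.1% of the captured footage carries genuine signal; the other 99.9% is exactly the noise described above.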
Conclusion: It is not possible to design a reliable system that links emotions to customer satisfaction
To summarise, here are the four arguments that lead me to say that Paul Hermelin oversold what Cap Gemini may have achieved.
- Emotions are expressed on the face mainly as micromovements, for which no interpretation algorithms are currently available. Existing emotion-analysis algorithms are limited to recognising obvious movements (a broad smile, for example).
- The recognition of facial micromovements would require very high-resolution images, which is possible in the laboratory but highly unlikely at present in ordinary working environments.
- Analysing very high-resolution images (especially for the hundreds or even thousands of passengers on a ship) would require excessively high computing power.
- The transience of emotions would make it difficult to interpret them accurately in a context linked to satisfaction or dissatisfaction. Most of the analysed images would therefore constitute noise, making the system inefficient.