Emotient: Facial Expression Recognition "Trumps" All

By The Myers Report Archives

Emotional testing, neuroscience and community-based measurement are expanding the boundaries of traditional research, including political research, by offering new, insightful tools to predict audience and consumer behavior. A particularly interesting example is the work being done by Emotient, a three-year-old, San Diego-based software company focused on emotional measurement. Their position: emotions drive spending. By focusing on facial expression recognition and registering emotion through facial expression, Emotient predicts the emotional resonance of any visual, including advertising messages, talent, video content and even a political candidate.

Based on its measurement of audience reactions during the first Republican debate on Fox News, Emotient projected that Donald Trump would be the clear winner. Trump elicited by far the strongest emotions: people laughed when he wanted them to laugh; they became angry when he wanted them to be angry; and they were fearful when he tried to scare them. Emotient will be tracking results throughout the debate season.

There are several companies using biometrics to help marketers ascertain the non-verbal and unstated impact of their content. Tools like Galvanic Skin Response (GSR), which measures changes in the skin, and Functional Magnetic Resonance Imaging (fMRI), which measures brain activity through blood flow, are touted as good methods to measure subconscious consumer response. But according to Marni Bartlett, Ph.D., Co-Founder and Lead Scientist of Emotient, GSR data is noisy and slow, while fMRI is slow and cumbersome.

Emotient's IP can capture hundreds of facial expressions at once in a non-intrusive manner, which, they say, offers the capability to amass big data by individuals' gender and age. Emotient can track live responses at conferences, sporting events, concerts and business meetings. There are nine basic emotions, each alone or in combination offering insight into the desirability (or lack thereof) of the messaging and creative, plus twenty more facial muscle movements that can be captured. Once participants opt in, Emotient's technology can be used in nearly any setting or venue to capture facial expressions, individually or in groups. Imagine gathering data via a tablet or smartphone, or in a car with a sensor pointed at the driver.
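To make the aggregation step concrete, here is a minimal sketch of how per-frame emotion scores might be rolled up into demographic-segmented summaries of the kind described above. Everything here is hypothetical and illustrative: the emotion labels, field names and numbers are assumptions, and this is not Emotient's actual API or model.

```python
# Hypothetical sketch: rolling up per-frame emotion scores into
# audience-level summaries segmented by demographic.
# All names and numbers are illustrative, not Emotient's actual system.
from collections import defaultdict
from statistics import mean

# An assumed set of nine basic emotion labels (illustrative only).
EMOTIONS = ["joy", "anger", "fear", "disgust", "sadness",
            "surprise", "contempt", "confusion", "neutral"]

def summarize(frames):
    """frames: list of dicts with 'gender', 'age_band', and one
    score per emotion, as a per-frame classifier might emit."""
    buckets = defaultdict(lambda: defaultdict(list))
    for f in frames:
        segment = (f["gender"], f["age_band"])
        for emo in EMOTIONS:
            buckets[segment][emo].append(f[emo])
    # Mean score per emotion within each demographic segment.
    return {seg: {emo: round(mean(vals), 3) for emo, vals in emos.items()}
            for seg, emos in buckets.items()}

# Two illustrative frames from the same segment.
frames = [
    {"gender": "F", "age_band": "18-34", "joy": 0.7, "anger": 0.1,
     **{e: 0.0 for e in EMOTIONS if e not in ("joy", "anger")}},
    {"gender": "F", "age_band": "18-34", "joy": 0.5, "anger": 0.3,
     **{e: 0.0 for e in EMOTIONS if e not in ("joy", "anger")}},
]
summary = summarize(frames)
print(summary[("F", "18-34")]["joy"])  # mean joy across frames: 0.6
```

The point of the sketch is only the shape of the data flow: many cheap per-frame observations, bucketed by segment, yield the kind of scalable baseline measurements discussed next.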

The result is an ability to scale and to create a range of baseline measurements for a specific creative or product type, such as automobile ads or financial services. These results can be further segmented by demographics such as gender and age. Joshua Susskind, Emotient's Co-Founder and Lead Deep Learning Scientist, explained that via their methodology, P&G's Tide detergent was able to more accurately predict intent to purchase across three different detergent fragrances using facial expression recognition, while the traditional survey the client fielded at the same time could not.

This certainly makes a compelling argument for the expanded use of facial expression recognition to help move the media needle. Such methodology might become a tool for advertising creatives, enabling them to place ads in program content with the greatest emotional synergy. It can also be used to fine-tune a message so it better resonates with a desired target audience. For instance, a fast food company whose commercial angers women may want to tweak its message for a more engaging and positive response from them, or place its ad in more male-dominated media content. Interestingly, not all negative emotions are bad for messaging, according to Susskind. An ad that elicits "Disgust" may be perfect for detergent but not for food.

But the real question is: if actual cause and effect can be measured through the scalable measurement of emotional reactions, do we still need to rely on our historical, data-centric stats, which are less nuanced and less predictive? Isn't it time to integrate these new neuroscience-based measurement capabilities and approaches into our current measurement toolkit?

My opinion is that, as an industry, media has historically been wedded to delivery metrics such as ratings and GRPs. These measurements have been embedded in the system since its inception. While new technology and neuroscience applications are certainly exciting and can produce impactful results, they are likely to be adopted as media industry currency only when they are integrated into the marketing cloud with other big data resources, such as set-top box data and credit card purchase data. Until then, emotion-based data solutions are more likely to serve as consultative, supportive measurement tools. For creative research, video content evaluation, talent assessment and political insights, however, they could quickly become standard measures.

The opinions and points of view expressed in this commentary are exclusively the views of the author and do not necessarily represent the views of MediaVillage.com / MyersBizNet, Inc. management or associated bloggers.

Copyright ©2024 MediaVillage, Inc. All rights reserved. By using this site you agree to the Terms of Use and Privacy Policy.