Emotional marketing: Elicit emotions in order to stand out from the competition

7 minute read

Emotions build memories and drive action, which is why they hold a special place in advertising. Brands rely on emotional marketing to gain a competitive advantage at each step of the consumer journey – it drives the brand’s audience to notice, remember, share and, ultimately, buy the product or service.

And although there is no doubt that successful ads build an emotional connection with viewers through emotional marketing, emotional appeal is only one part of that success. Ads should also be relevant and bring a sense of novelty, both of which lead to higher attention and memorability. So, to deem an ad successful, we need all parts of the pie, which is why survey, RTM, eye tracking and retention should be combined with emotion recognition.

So, here’s your guide to everything you need to know about emotion recognition!

Anchoring: Associating emotions with the brand

Besides meeting objectives specific to each consumer journey stage, advertising has an opportunity to anchor positive emotions to the brand. Anchoring goes beyond the first-impression bias; it is stable yet fluid enough to be both relevant and manageable. Some of the biggest brands build their advertising around selling abstract positive concepts such as happiness, excitement, and security while simultaneously training our brains to associate the brand with those emotions. Anchoring can, however, work both for and against the brand, so it’s important to be aware of it and know how to leverage it. Ultimately, in order to plant an anchor correctly, advertisers need to know which emotions the ad evokes and in what particular context the emotion of interest peaks.

Explicit vs. implicit research methods: Stated input alone isn’t enough

When evaluating people’s preferences and judgments about specific content, the most reliable approach is to observe what they do (how they behave) instead of relying solely on what they say. Getting feedback is never an easy task for marketers, especially when it comes to the target group’s attitudes and beliefs – let alone emotions. Respondents may not be able or willing to share their likes and dislikes; moreover, they are unlikely to remember their emotional reaction to the stimulus after the exposure, or even be aware of it during the exposure. Furthermore, likes and dislikes are intrinsically emotional concepts, which is why an emotion-centric method is the best fit here. In line with that, emotion recognition proxy solutions like “emoji analysis” are just as misleading as a verbal response, as they still depend on the respondent’s willingness and capacity to share their opinion; they are symbolic rather than verbal, but remain explicit and voluntary in nature.

Besides a subjective experience (1) and a physiological dimension (2), emotions also encompass a behavioral component (3). The behavioral component is mostly automatic and involuntary; this is especially true when respondents are tested individually, isolated from social settings, because when grouped, and particularly if an authority figure is present, people are more likely to manage or mute some of their facial expressions.

EyeSee’s study prerequisites (i.e. remote testing, with FC and ET calibration that allows for only one person at a time) make participation an individual experience, allowing the automatic and involuntary nature of facial expressions to come through. This makes emotion recognition the most objective method for assessing the audience’s emotional response to advertising efforts. And though there are other implicit methods that yield insight into emotional experience (like EEG and fMRI), emotion recognition is undoubtedly the most scalable (tests are conducted online) and the least intrusive one in marketing research (there are no wires attached; respondents sit comfortably at home during the test). To participate in an emotion recognition study, respondents are required to provide emotion recognition consent. According to EyeSee’s internal research, there is no statistically significant difference in demographics (age, gender, ethnicity, education level, income, employment and marital status) between respondents who consent to emotion recognition and those who do not, which further supports the feasibility of the solicited sample distribution for our clients’ studies.
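
The consent-bias claim above lends itself to a simple statistical check. As a minimal illustration of how such a comparison might be run (a generic sketch; the column names, data and the use of a chi-square test are assumptions, not EyeSee’s internal pipeline), one can test whether consent status is independent of a demographic variable:

```python
# Minimal sketch: is consent to emotion recognition independent of a
# demographic variable (here, a hypothetical age-group column)?
import pandas as pd
from scipy.stats import chi2_contingency

respondents = pd.DataFrame({
    "consented": [True, True, False, True, False, True, True, False],
    "age_group": ["18-34", "35-54", "18-34", "55+", "35-54", "55+", "18-34", "55+"],
})

# Cross-tabulate consent status against the demographic variable ...
table = pd.crosstab(respondents["consented"], respondents["age_group"])

# ... and test for independence; a large p-value means no evidence that
# consenters and non-consenters differ on this demographic.
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")
```

The same test, run per demographic variable on a real sample, is one way to substantiate that a consent requirement does not skew the sample composition.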

How facial expressions are scored: FACS

We are all facial coders – as a species, we rely on one another for survival and wellbeing, and it is very important that we can quickly look at someone’s face, evaluate how they feel, and navigate our interactions with them accordingly. But how can we come to a consensus that we are talking about the same expressions and identifying a smile as a smile? To standardize descriptions of facial muscle movements, Paul Ekman constructed the Facial Action Coding System (FACS), a tool for classifying all possible facial expressions humans can produce. In this comprehensive system, each component of facial movement is called an action unit, and every facial expression can be broken down into action units. Together with Wallace Friesen, Ekman identified 46 distinct action units that, in combination, can describe any facial movement.
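
To make the idea of action units concrete, here is a minimal sketch of how a handful of commonly cited AU combinations map onto prototypical expressions. The subset and the simple overlap scoring below are illustrative assumptions drawn from the FACS literature, not EyeSee’s scoring logic:

```python
# Illustrative subset of prototypical action-unit (AU) combinations from the
# FACS literature; real coding uses the full set of AUs plus their intensities.
PROTOTYPES = {
    "happiness": {6, 12},        # cheek raiser + lip corner puller
    "surprise":  {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
    "sadness":   {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
    "disgust":   {9, 15},        # nose wrinkler + lip corner depressor
}

def closest_expression(active_aus: set[int]) -> str:
    """Return the prototype whose AUs overlap most with the observed AUs."""
    scores = {
        label: len(active_aus & aus) / len(aus)
        for label, aus in PROTOTYPES.items()
    }
    return max(scores, key=scores.get)

# A face showing AU6 and AU12 (a Duchenne smile) scores highest on happiness.
print(closest_expression({6, 12}))  # -> "happiness"
```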

EyeSee’s emotional framework, which relies on Paul Ekman’s FACS[1], comprises 5 basic and 2 cognitive expressions:

1. Basic emotional expressions: happiness, surprise, dislike (an umbrella category for disgust, contempt, and anger), fear, sadness

2. Cognitive emotional expressions: interest, confusion

Besides the basic emotional expressions (happiness, surprise, disgust, contempt, fear, sadness, and anger) commonly covered by behavioral marketing research companies that utilize emotion recognition, EyeSee’s framework also includes cognitive emotions (interest and confusion). Cognitive emotions are an important addition when it comes to understanding how attention is managed and how to optimize it over the course of the ad. Interest is a cognitive emotion that inspires learning, engaging with new things (stimuli and experiences), and a willingness to further explore the product or service. Overall, it positively influences the attention given to the content, whereas confusion mainly has the opposite effect – people withdraw their attention and “decide” to spend their brain power on something else. Comprehensibility, however, is the hinge between interest and confusion: new or complex things that are comprehensible are interesting; new or complex things that are incomprehensible are confusing (Paul J. Silvia).
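
As a minimal sketch of how such a framework could sit on top of a raw expression classifier (the label names and scores below are hypothetical, not EyeSee’s actual output), per-observation probabilities for disgust, contempt and anger can be collapsed into the “dislike” umbrella, while interest and confusion remain separate cognitive categories:

```python
# Hypothetical per-frame output of an expression classifier.
raw_frame = {
    "happiness": 0.05, "surprise": 0.10, "disgust": 0.30,
    "contempt": 0.15, "anger": 0.05, "fear": 0.02, "sadness": 0.03,
    "interest": 0.20, "confusion": 0.10,
}

# Framework of 5 basic + 2 cognitive expressions, with disgust, contempt
# and anger grouped under the "dislike" umbrella.
UMBRELLAS = {"dislike": ("disgust", "contempt", "anger")}

def to_framework(frame: dict[str, float]) -> dict[str, float]:
    grouped = dict(frame)
    for umbrella, members in UMBRELLAS.items():
        grouped[umbrella] = sum(grouped.pop(m, 0.0) for m in members)
    return grouped

print(to_framework(raw_frame))
# {'happiness': 0.05, 'surprise': 0.1, 'fear': 0.02, 'sadness': 0.03,
#  'interest': 0.2, 'confusion': 0.1, 'dislike': 0.5}
```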

Transforming Emotion Recognition data into insights

The emotional experience behind the expressions is a level tackled in the interpretation phase, by combining the facial expression with the stimulus – in other words, by placing the expression in context. In context, happiness can be interpreted as relief or sympathy, surprise as awe, contempt as disbelief, and disgust as discomfort; some expressions, like a frown, can even be disregarded in the interpretation phase if their peak coincides with a scene featuring an excessive light shift in the stimulus. Supported by survey analysis and eye tracking, we can further pin down the nuance behind the labels.
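
A minimal sketch of what that context check could look like in practice (the frown signal, scene annotations and threshold below are made-up assumptions, not EyeSee’s pipeline): find where a frown signal peaks over the course of the ad, and disregard peaks that fall inside scenes annotated with an excessive light shift:

```python
import numpy as np

# Hypothetical second-by-second frown intensity for a 10-second ad (0..1).
frown = np.array([0.1, 0.1, 0.2, 0.7, 0.3, 0.1, 0.6, 0.2, 0.1, 0.1])

# Hypothetical stimulus annotations: (start_s, end_s) of scenes with an
# abrupt light shift, where a frown is likely just squinting.
light_shift_scenes = [(3, 5)]
PEAK_THRESHOLD = 0.5

def interpretable_peaks(signal, artifact_windows, threshold):
    """Seconds where the signal peaks outside annotated artifact scenes."""
    peaks = [t for t, value in enumerate(signal) if value >= threshold]
    return [
        t for t in peaks
        if not any(start <= t < end for start, end in artifact_windows)
    ]

# The peak at second 3 falls inside the light-shift scene and is dropped;
# the peak at second 6 is kept for interpretation.
print(interpretable_peaks(frown, light_shift_scenes, PEAK_THRESHOLD))  # -> [6]
```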

Which emotions is the creative team after? Intended emotional outcome and its repercussions on interpretation

Remember Burger King’s “Mouldy Whopper” ad? It’s a bold commercial designed to provoke disgust in order to make a point while skyrocketing its memorability: the burgers rot over time because they are organic, and organic means fresh – no artificial preservatives. Depending on the ad’s narrative, negative emotions like disgust, contempt, sadness, and fear can be part of the creative objectives. While it may seem counterintuitive to intentionally trigger these emotions, they can be powerful drivers of action, especially if your product or service offers a solution to them. This is why valence-based approaches, which limit themselves to positive- and negative-emotion KPIs, should be taken with reserve; used alone, without emotional profiles, they are less explanatory because they do not account for the fact that emotions of the same valence differ in essential ways. Furthermore, “roller coaster” narratives demand negative emotions for the positive emotions to take place – e.g., in order to elicit relief, you first need to elicit fear. It comes down to this: interpretation greatly depends on the emotional marketing strategy behind the creative’s narrative.
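
A quick worked example of why a valence-only KPI is less explanatory (the numbers are invented for illustration): two ads can land on exactly the same net valence score while evoking very different emotional profiles, one driven by fear followed by relief, the other by mild, flat positivity.

```python
# Hypothetical share of respondents showing each expression during two ads.
ad_a = {"happiness": 0.35, "fear": 0.25, "surprise": 0.10, "sadness": 0.00}
ad_b = {"happiness": 0.30, "fear": 0.00, "surprise": 0.00, "sadness": 0.10}

POSITIVE = {"happiness", "surprise"}
NEGATIVE = {"fear", "sadness"}

def net_valence(profile):
    positive = sum(v for k, v in profile.items() if k in POSITIVE)
    negative = sum(v for k, v in profile.items() if k in NEGATIVE)
    return round(positive - negative, 2)

# Both ads score identically on a valence-only KPI ...
print(net_valence(ad_a), net_valence(ad_b))  # -> 0.2 0.2
# ... yet ad_a is a fear-and-relief roller coaster while ad_b is mildly
# pleasant throughout; only the full emotional profile reveals this.
```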

Benchmarks – Compare apples to apples

When assessing an ad’s performance on an emotional level, providing relevant benchmarks is a must. Stimuli of different lengths attain different levels of attention and allow different degrees of emotional engagement to develop. The same rule applies to creative solutions (e.g., is the stimulus more collage-like or movie-like?). And although a global benchmark is a good fit in many cases (given the right length and creative solution), for some studies it is best to apply a culture-specific benchmark (i.e. for a study conducted in Japan, an “Asian benchmark” would be the best fit). In conclusion, it is essential to consider both sample and stimulus specifications when conducting significance testing; an extensive benchmark base allows EyeSee’s research team to do exactly that and reach reliable conclusions.
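
As a sketch of what benchmark-based significance testing might look like (the values, the benchmark figure and the choice of a one-sample t-test are assumptions for illustration, not EyeSee’s methodology), the tested ad’s per-respondent happiness rate can be compared against a benchmark built from ads of comparable length and creative style:

```python
import numpy as np
from scipy.stats import ttest_1samp

# Hypothetical share of the ad during which each respondent smiled (0..1).
tested_ad = np.array([0.22, 0.31, 0.18, 0.40, 0.27, 0.35, 0.29, 0.33])

# Hypothetical benchmark mean for ads of comparable length and creative
# style (e.g., 30-second, movie-like ads tested in the same market).
BENCHMARK_MEAN = 0.21

# One-sample t-test: does this ad differ significantly from the benchmark?
t_stat, p_value = ttest_1samp(tested_ad, BENCHMARK_MEAN)
print(f"ad mean = {tested_ad.mean():.2f}, benchmark = {BENCHMARK_MEAN}, p = {p_value:.3f}")
```

Matching the benchmark on length, creative style and, where relevant, culture is what keeps such a comparison apples-to-apples.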

Afterthoughts

Behind some of the most successful brands is an emotional marketing strategy consistently conditioning our brains to associate the brand with positive emotions. These associations play a particularly important role when rational discriminative factors like price and product specifications are missing (i.e. when there is no apparent difference between price and product/service specifications among competitor brands). Being both the least intrusive and the most scalable method, emotion recognition serves as a window to honest feedback on emotional engagement, thus allowing creative teams to answer the following questions:

  1. Does the message / visual / narrative resonate with the target audience on an emotional level?
  2. Is the content creating the intended emotional state?
  3. Which are the most engaging parts of the stimulus?
  4. Where are the emotional peaks?
  5. Based on 1-4, what are the learnings and how can they be leveraged to design narratives that bring brands closer to their audience?

Are you looking for more methodology knowledge? Check out this blog about retention measurement!

[1] The Facial Action Coding System (FACS) is a comprehensive, anatomically based system for describing all visually discernible facial movement. It breaks down facial expressions into individual components of muscle movement, called Action Units (AUs). Ekman, P., & Friesen, W. V. (1978). Facial Action Coding System.

Vanja Radic
Emotion recognition team lead at EyeSee
Tags
Behavioral insight
Innovation