Verbal responses often fail to elicit the true nature of emotions. Consumers find it hard to verbalize emotions, and they may not even be conscious of their existence. Moreover, some emotions are personal and perhaps embarrassing to express aloud. Language and culture introduce further subjectivity: in some countries, for instance, people are less willing to express negative opinions. For these reasons, the focus of emotion research is shifting to indirect interviewing methods and non-verbal approaches.
The approach adopted by Ipsos ASI makes use of Emoti*Scape™, a show card containing a map of 40 emotions (see Exhibit 23.10), each represented by an illustration of a facial expression (an emoticon) and a verbal description of the feeling expressed. Respondents are asked to indicate the point on this emotional landscape that best represents their feelings about each of the following:
Their responses are summarized to reflect the emotional reactions and responses to the advertisement and the brand.
There are also a number of physiological methods of testing advertisements that measure the respondents’ involuntary reactions to stimuli. For instance, research on autism at the MIT Media Lab and the University of Cambridge has spawned non-verbal technologies to measure emotional response.
One of these is a wearable, wireless biosensor that measures emotional arousal via skin conductance (also known as galvanic skin response). It is based on the fact that electrodermal activity rises during states such as excitement, attention or anxiety, and falls during states such as boredom or relaxation.
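To make the principle concrete, here is a minimal sketch, not any vendor's API, of how a rise in electrodermal activity might be flagged in software. The baseline, threshold multiplier, window size and sample values are all hypothetical, chosen purely for illustration.

```python
# Hypothetical sketch: flag arousal episodes in a skin-conductance trace
# by comparing a short trailing moving average against a resting baseline.
# All thresholds and readings below are illustrative, not vendor values.

def moving_average(samples, window):
    """Trailing moving average over a list of conductance readings."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        chunk = samples[lo:i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def arousal_flags(samples, baseline, threshold=1.2, window=4):
    """Mark a sample True when smoothed conductance exceeds
    baseline * threshold -- a crude proxy for heightened arousal."""
    smoothed = moving_average(samples, window)
    return [s > baseline * threshold for s in smoothed]

# Example: conductance (microsiemens) rises mid-trace while an ad plays.
trace = [2.0, 2.1, 2.0, 2.2, 3.1, 3.4, 3.6, 3.2, 2.4, 2.1]
flags = arousal_flags(trace, baseline=2.0)
print(flags)
```

The smoothing step matters: raw skin-conductance readings are noisy, so a single elevated sample should not count as arousal on its own.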
Emotional states such as liking and attention may also be gauged from facial expressions. Neuromarketers have accordingly developed tools that read facial expressions using a webcam to give insights into consumers’ response to advertising.
The development of portable, wireless electroencephalogram (EEG) (see Exhibit 23.11) scanners has enabled neuroscientists to gain insights into how the mind responds to stimuli. Sensors covering the surface area of the brain can capture synaptic (brain) waves, and amplify and dispatch them to a remote computer. The resulting streams of data reveal participants’ subconscious responses to advertisements, and capture their emotion, attention and memory retention during the course of the commercial.
EEGs are well suited to capturing signals about attention, arousal, fatigue and surprise, which are emitted from the brain's surface. They are less effective at picking up signals from deeper within the brain that are key to decision making. EEGs are therefore better suited to testing feelings and emotions, and not appropriate for testing informational ads or commercials that require thinking.
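One common way EEG data streams are turned into metrics is band-power analysis: the signal's energy in a frequency band (for example alpha, roughly 8–12 Hz) is computed and tracked over the course of a commercial. The sketch below, a simplification rather than any device's actual pipeline, computes band power with a naive DFT on a synthetic trace; the sampling rate, band limits and test signal are assumptions for illustration.

```python
# Illustrative sketch: band power of an EEG-like trace via a naive DFT.
# The trace here is synthetic (a 10 Hz "alpha" tone plus a weaker 25 Hz
# "beta" tone); real pipelines use filtering/FFT libraries and artifact
# rejection, which are omitted for brevity.
import math

def band_power(signal, sample_rate, low_hz, high_hz):
    """Sum squared magnitudes of DFT bins whose frequency lies in the band."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):          # skip DC; positive frequencies only
        freq = k * sample_rate / n
        if low_hz <= freq <= high_hz:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / (n * n)
    return power

rate = 128                              # assumed sampling rate (Hz)
trace = [math.sin(2 * math.pi * 10 * t / rate)
         + 0.3 * math.sin(2 * math.pi * 25 * t / rate)
         for t in range(rate)]          # one second of synthetic signal

alpha = band_power(trace, rate, 8, 12)
beta = band_power(trace, rate, 13, 30)
print(alpha > beta)                     # alpha dominates this synthetic trace
```

In practice such band powers are computed on short sliding windows, producing the second-by-second streams of attention and emotion scores described above.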
The need for a controlled location or laboratory environment makes EEGs somewhat restrictive and removed from the natural settings in which advertisements are processed. Even so, the use of EEGs is picking up as equipment costs decline.
In recent years we have seen a rapid rise in the development and adoption of the non-conventional analytics techniques described here. A host of organizations, many relatively new, have sprung up to build devices for eye tracking, EEG, facial expressions and biosensors. Other organizations, such as iMotions, have developed solutions that let researchers integrate these technologies, as well as surveys, into a single unified software platform.
Marketers are increasingly using a blend of eye tracking, EEG, facial coding and biometrics, in combination with conventional quantitative and qualitative research studies, to evaluate the effectiveness of advertising and other elements of the marketing mix, including packaging and product usage.
Collectively these techniques provide a rich source of diverse metrics. Biometric measures reveal the extremes of emotional engagement, EEG devices provide a more granular understanding of respondents' emotions, thoughts and motivations, and facial coding helps to interpret their facial expressions. Combined with eye tracking, these methods let marketers identify the specific elements of the ad or stimulus that capture consumers' attention and trigger their responses.
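The integration step can be pictured as a timestamp join: each eye-tracking fixation on an ad element is paired with the facial-coding label recorded closest to that moment. The sketch below is hypothetical; the field layout, element names and emotion labels are invented for illustration and do not correspond to any platform's data format.

```python
# Hypothetical sketch: fuse eye-tracking fixations with facial-coding
# labels by nearest timestamp, so each attended ad element is paired
# with the emotion observed at that moment. All data below is invented.

def nearest_label(timestamp, labels):
    """Return the facial-coding label closest in time to the fixation."""
    return min(labels, key=lambda lab: abs(lab[0] - timestamp))[1]

def fuse(fixations, emotion_labels):
    """fixations: (seconds, ad_element) pairs;
    emotion_labels: (seconds, emotion) pairs."""
    return [(element, nearest_label(t, emotion_labels))
            for t, element in fixations]

fixations = [(0.5, "logo"), (2.1, "tagline"), (4.8, "product shot")]
emotions = [(0.4, "neutral"), (2.0, "smile"), (5.0, "surprise")]
print(fuse(fixations, emotions))
```

Commercial platforms perform this alignment (plus clock synchronization across devices) automatically, but the underlying idea is this simple join of time-stamped streams.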
Details about eye tracking, EEG, facial coding and biometric devices are provided in Chapter Biometrics.
Note: To find content on MarketingMind type the acronym ‘MM’ followed by your query into the search bar. For example, if you enter ‘mm consumer analytics’ into Chrome’s search bar, relevant pages from MarketingMind will appear in Google’s result pages.
Marketing has changed. More so in practical terms, and marketing education is lagging.
The fundamental change lies in the application of analytics and research. Every aspect of the marketing mix can be sensed, tracked and measured.
That does not mean that marketers need to become expert statisticians. We don't need to learn to develop marketing mix models or create perceptual maps. But we should be able to understand and interpret them.
MarketingMind helps. But the real challenge lies in developing expertise in the interpretation and the application of market intelligence.
The Destiny market simulator was developed in response to this challenge. Traversing business years within days, it imparts a concentrated dose of analytics-based strategic marketing experiences.
Like fighter pilots, marketers too can be trained with combat simulators that authentically reflect market realities.
But be careful. There are plenty of toys that masquerade as simulators.
Destiny is unique. It is an authentic FMCG (CPG) market simulator that accurately imitates the way consumers shop, and replicates the reports and information that marketers use at leading consumer marketing firms.
In a classroom setting you are pitted against others; as an independent learner, you play against the computer. Either way, you learn to implement effective marketing strategies, develop an understanding of what drives store choice and brand choice, and become proficient in the use of market knowledge and financial data for day-to-day business decisions.