Back at the tail end of 2016, we posted about how emotion analytics has changed lives through its use in advertising, gaming, market research and video chat, among other areas. Technology capable of recognising human emotions from facial expressions has evolved since then, as have its possible applications. Here, we take a look at a few potential uses for the technology that could be just around the corner...

Healthcare

Technology's ability to recognise and categorise emotions has yet to surpass that of the human brain. However, in certain scenarios an individual's capacity for this important everyday function may be impaired – and that is where this burgeoning technology can step in.

Conditions that can affect a person's ability to read emotions include Parkinson's disease, Borderline Personality Disorder (BPD, also known in the UK as Emotionally Unstable Personality Disorder) and schizophrenia. For people living with one of these conditions, an application could emerge that helps them recognise and respond appropriately to the emotions of others.

Another area where this tech could be valuable is mental health. Emotion analysis technology could support individuals experiencing depression or other mental health issues by reading their present emotional state and reporting their ups and downs to close friends and family in their support network, or by suggesting timely treatments or courses of action based on the emotions identified in the moment.

Mobile

Manufacturers of mobile devices are constantly expanding the usefulness of their products with new features and capabilities. An important element of this comes in the form of sensors that detect and record data on a user's behaviour or actions, such as their movement, proximity to the phone or location. Emotion recognition could well become the next sensor built into every smartphone on the market – and developments such as the iPhone X's Animoji (emoji that mirror a user's facial expressions) suggest that this moment is not far away.

The range of ways this information could be used is near limitless. Aside from the healthcare-related uses mentioned above, reading a user's emotional state could help apps provide immediate feedback: seeking clarity when the user is angry, offering encouragement when they're feeling down, celebrating with them when they're excited, asking if they need help when they look confused, or otherwise adapting the app's functionality to suit their mood. Some more specific ways such technology could capitalise on a user's emotional state are explored below.
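As a rough illustration of the idea above, an app that receives an emotion label from an on-device recogniser might simply branch on it to pick a response. Everything here is hypothetical – the label names, the `respond_to_emotion` function and the messages are assumptions for the sketch, not any real SDK's API:

```python
# Hypothetical sketch: branching app behaviour on an emotion label
# supplied by an (assumed) on-device emotion-recognition component.
# The labels and messages below are illustrative, not a real API.

RESPONSES = {
    "angry": "Sorry this isn't working — can you tell us what went wrong?",
    "sad": "Hang in there! Here's something that might cheer you up.",
    "excited": "Great to see you're enjoying this!",
    "confused": "Need a hand? Here's a quick walkthrough.",
}

def respond_to_emotion(emotion: str) -> str:
    """Return an in-app message suited to the user's detected mood."""
    # Unknown or neutral moods produce no change in app behaviour.
    return RESPONSES.get(emotion, "")

print(respond_to_emotion("confused"))
```

In practice the mapping would be far richer than a lookup table, but the pattern – detect, classify, adapt – is the core of the feedback loop described above.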

Social media

Emotion analysis in mobile phones could make posting on social media even quicker and easier. It wasn't all that long ago that Facebook moved on from a simple Like button to letting users select from a range of emotional reactions. Imagine if, rather than painstakingly selecting a reaction to a post, users automatically received a suggested reaction based on their facial expression while viewing it – or a suggested post or tweet in response to the emotion they're feeling.

Market research

Research to improve the success rates of advertising campaigns, including measuring viewers' emotional reactions to an ad to gauge its effectiveness, has been commonplace for years. These days, automatic emotion recognition makes the process quicker, simpler and potentially more accurate than asking participants how they felt about an ad. This kind of video research has been used by a variety of brands, including Kellogg's, which tested several versions of an ad on a sample audience and rolled out the one that got the best response.

With emotion recognition likely to become a prevalent smartphone feature, advertisers and app developers could measure a user's mood while they use an app, or record their responses to in-app advertising, and tailor the app or ad content accordingly in real time. The technology would also allow feedback to be collected on how the user base responds to particular ads or app features.
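One simple way to turn the per-viewing emotion data described above into a comparable metric is to score an ad by the share of positive reactions sampled while it plays. The sketch below is an assumption-laden illustration – the label names, the positive set and the scoring rule are invented for the example, not an established measure:

```python
from collections import Counter

# Hypothetical sketch: scoring an in-app ad by the fraction of sampled
# emotion labels that are positive. The label vocabulary and the choice
# of which labels count as "positive" are assumptions for illustration.

POSITIVE = {"happy", "excited", "surprised"}

def ad_engagement_score(samples: list[str]) -> float:
    """Fraction of sampled frames showing a positive emotion."""
    if not samples:
        return 0.0
    counts = Counter(samples)
    return sum(counts[e] for e in POSITIVE) / len(samples)

# Labels sampled at intervals while one viewer watched an ad.
samples = ["neutral", "happy", "happy", "bored", "excited", "neutral"]
print(f"{ad_engagement_score(samples):.2f}")  # 3 positive frames out of 6
```

Scores like this could then be compared across ad variants – the kind of comparison the Kellogg's example relied on – or fed back in real time to decide which creative to show next.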

Developments in emotion analysis technology are providing exciting business opportunities in many areas. Here at Plotto, we use it in our video market research application to identify how a respondent feels while giving their answers to a survey. By getting clarity on respondents' emotional reactions, researchers can check that respondents feel the way they say they do, or gain insight into the bigger picture of what a respondent is feeling beyond the words they're saying. If you'd like to know more about how a Plotto video survey can improve the effectiveness of your market research, head over to www.plotto.com.
