Artificial Emotional Intelligence

How are you feeling? AI can tell…

You wouldn’t have thought it, but AI is getting emotional. Admittedly, artificially intelligent technology can’t understand emotions in the way humans do, but it’s edging ever closer. In fact, in an experiment conducted at The Ohio State University, even a basic AI was able to recognise different emotions more accurately than the human participants. The study of systems that can recognise, interpret and even recreate human emotion is known as affective computing, and by 2022 the market is expected to be worth $41bn. But can technology really be emotionally intelligent, and what are the opportunities and pitfalls of this relatively novel industry?

You might not understand AI, but AI understands you

Emotions are one of the cornerstones of humanity, shared, it seems, by some animals but not by machines… or so we thought. Amazon’s Alexa is already detecting sentiment to target advertising and suggestions, and social robots like Buddy and Pepper are trained to interact with humans on an emotional level. As GAFA and many other companies battle to win the AI race, facial and voice recognition are becoming more accurate than ever. As a result, technology is getting incrementally better at understanding humans.

Across industries, artificial emotional intelligence can work in a number of ways. For example, AI can monitor a user’s emotions and analyse them to achieve a certain outcome. This could include tracking stress levels in the workplace, or measuring emotions during cognitive behavioural therapy. AI can also use emotional readings as part of decision making, for example in marketing campaigns. As early as 2015, an AI poster campaign launched by M&C Saatchi recognised consumer emotions using a camera and a genetic algorithm; the advertisement then changed based on the consumer’s reaction. Another way in which affective computing can work is by mimicking or replacing human emotions and interactions. Applications include social robots, and AI assistants in industries like healthcare and hospitality. Each of these applications has clear business advantages, bringing companies closer to consumers and helping them understand customers on a deeper level.
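To make the idea more concrete, an adaptive poster can be thought of as a simple capture, classify, adapt loop: grab an image of the viewer, estimate their emotional reaction, and switch the creative accordingly. The Python sketch below is purely illustrative and far simpler than M&C Saatchi’s actual genetic-algorithm approach; `classify_emotion` is a placeholder for whatever facial-expression model a real deployment would use, and the emotion labels and creative filenames are assumptions made for the example.

```python
import cv2  # OpenCV is used here only to grab frames from the poster's camera

# Hypothetical creative variants keyed by the viewer's dominant emotion.
AD_VARIANTS = {
    "happy": "upbeat_creative.png",
    "neutral": "default_creative.png",
    "bored": "attention_grabbing_creative.png",
}


def classify_emotion(frame):
    """Placeholder for a trained facial-expression model.

    A real system would run a classifier on the frame and return a label
    such as 'happy', 'neutral' or 'bored'; here we simply default to
    'neutral' so the sketch runs end to end.
    """
    return "neutral"


def run_adaptive_poster(max_frames=100):
    camera = cv2.VideoCapture(0)             # camera facing the viewer
    current_creative = AD_VARIANTS["neutral"]
    try:
        for _ in range(max_frames):
            ok, frame = camera.read()
            if not ok:                        # camera unavailable or stream ended
                break
            emotion = classify_emotion(frame)
            # Swap the creative only when the viewer's reaction changes.
            new_creative = AD_VARIANTS.get(emotion, current_creative)
            if new_creative != current_creative:
                current_creative = new_creative
                print(f"Viewer looks {emotion}: switching to {current_creative}")
    finally:
        camera.release()


if __name__ == "__main__":
    run_adaptive_poster()
```

The same capture-and-respond pattern arguably underpins the other applications mentioned above; a workplace stress monitor or a therapy aid would simply replace the advertising step with logging or alerting.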

Replacing IQ with EQ

We’ve come to accept that disruptive technology is going to take over manual, repetitive, and administrative roles. What we’re less accustomed to is the idea that AI could also do jobs that require a personal touch – think carers, medical staff, therapists, and even teachers. If AI can demonstrate high-quality emotional intelligence, then these professions are at just as much risk of automation. One of the key retorts to the worry that ‘AI will take everyone’s jobs’ is that humans will take on more emotional roles. If those roles are also gobbled up by intelligent tech, what’s left for us?

Of course, this is all hypothetical. For now it’s safe to say that AI is still getting its artificial brain around the nuances of human emotion. Even if AI can reliably recognise complicated feelings, the obstacles go beyond technological ability. Our lives are already tracked and analysed by the organisations that gather our financial, social, security and health data. Trying to add emotional data to that list could be seen as prying or manipulative. There’s also something unsettling about replacing human emotional roles with artificial emotional intelligence. You might use AI to schedule your weekly meetings, but would you happily leave your ageing mother alone with an artificially intelligent robot carer?

Queen Elizabeth I wisely refused to ‘make windows into the souls of men,’ but through artificial emotional intelligence, the likes of Google, Microsoft and Amazon are attempting to do just that. However, before including artificial emotional intelligence in their products, services, and systems, companies need to make sure that consumers are fully aware of it and have given their consent. There’s also the ongoing concern that the technology could be inaccurate, leading AI to draw some questionable conclusions. From a marketing perspective this is not too serious, as it would do little harm beyond serving an ad for something you would never buy. But if AI took a dodgy reading in a therapy session or on a hospital ward, the results could be disastrous. Artificial emotional intelligence should therefore be used with great caution, and in conjunction with emotionally intelligent humans. Given the AI talent gap, perhaps finding those people will be the biggest challenge of all.

To read more about the advance of Artificial Intelligence, sign up for our free newsletter.