How do you feel about this…
Technologists have long maintained that, however capable technology becomes, it can never offer the same insight or experience as a human. When it comes to genuinely engaging with emotion, it's often asserted that robots and interactive software cannot come close to human conversation. Yet with access to mass information and the ability to recognise facial and vocal changes, machines are edging closer to an astute understanding of human expression. Technology may never match the emotional intelligence of a person, but the gap is narrowing. Robotic companions, Artificial Intelligence software and chatbots are bringing robot-to-human interaction into mainstream culture, and in many cases it's already very hard to tell an automated assistant from a human operator. A number of innovative technologies can now claim to recognise and respond to emotions. But how do they do this? And what implications does it have for the use of technology in wider society?
How does technology understand emotions?
Developers are exploring various ways to programme technology to recognise emotions. Social robots like Pepper can read emotions via tone of voice, choice of words and facial expressions, and then respond in an appropriate way. If Pepper detects that its owner is sad, it will suggest a game or tell a joke. Similarly, a smart toy dinosaur called Dino has been developed by CogniToys to know when a child is experiencing fear or anxiety. The toy is powered by AI, but is described as emotionally intelligent. Taking AI out of the toybox and into the courtroom, an artificially intelligent system developed by UCL, the University of Sheffield and the University of Pennsylvania predicted judicial decisions in human rights trials with 79% accuracy. Another AI system, called MogIA, used societal engagement data to successfully predict the outcome of the US election, which most would agree is an incredibly emotional event. Not only can technology understand human emotion; it can now offer what could be called an 'emotional' response. Add increasingly natural speech capabilities to realistic facial expressions, and you're looking at a very believable humanoid that can hold an engaging, two-way conversation. Whilst tech might not be able to actually experience feelings, it can certainly comprehend them.
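By way of illustration only (none of the products above publish their methods, and real systems combine voice tone, facial analysis and machine learning), a crude keyword-lexicon sketch in Python shows the general shape of the idea: map an input signal to an emotion label, then to a scripted response, much as Pepper suggests a game when it detects sadness. All names and keyword lists here are invented for the example.

```python
# Toy sketch of emotion detection from word choice alone. Production
# systems use trained models over audio, video and text; this keyword
# lexicon only illustrates the signal -> label -> response pipeline.

EMOTION_LEXICON = {
    "sad": {"sad", "unhappy", "miserable", "down", "lonely"},
    "happy": {"happy", "great", "wonderful", "excited"},
    "afraid": {"scared", "afraid", "worried", "anxious"},
}

RESPONSES = {
    "sad": "How about a game to cheer you up?",
    "happy": "Glad to hear it!",
    "afraid": "It's okay, I'm here with you.",
    "neutral": "Tell me more.",
}

def detect_emotion(utterance: str) -> str:
    """Return the emotion whose keywords best match the utterance."""
    words = set(utterance.lower().split())
    scores = {emotion: len(words & keywords)
              for emotion, keywords in EMOTION_LEXICON.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "neutral"

def respond(utterance: str) -> str:
    """Pick a scripted reply for the detected emotion."""
    return RESPONSES[detect_emotion(utterance)]

print(respond("I feel really sad and lonely today"))
# -> "How about a game to cheer you up?"
```

The interesting engineering is in the detection step; once an emotion label exists, responding "appropriately" is, at its simplest, a lookup.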
How disruptive is emotional technology?
The ability to express and understand emotion is perhaps the biggest differentiating factor between us and machines. Once this barrier is removed, it will be increasingly difficult to set people apart from realistic social robots. Early this year, Nanyang Technological University (NTU) revealed Nadine, a smart, humanoid robot that looks people in the eye as she speaks to them and remembers previous conversations. The robot is modelled on Professor Nadia Thalmann and bears a striking resemblance to her. Nadine now works as a receptionist at the university. Add a few more years of R&D, and you've got an ultra-realistic clone that could quite easily pass for a living, breathing human being. On the one hand this is terrifying, but on the other there are some really useful applications worth considering within the public sector. Imagine using AI or social robots to deal with suspected criminals in examination rooms, gathering a logical picture of the events of a situation and matching them to testimonies. Add lie-detecting capabilities, and that's a pretty useful detective. There's also potential for more smart toys like Dino, which enrich play by responding to how a child actually feels. Could we even see robotic therapists or doctors, dealing objectively with sensitive material? From a business angle, emotionally intelligent tech is all about delivering a higher quality of customer service – imagine social robots as novel receptionists or sales team members. Companies that develop technology with emotion recognition will offer a far more engaging service for their clients, whether those clients own a domestic social robot or chat to a particularly understanding AI on the shop floor.
The obvious problem…
Tech that can understand emotion does have a place in society as helpful assistants with a slightly more tactful approach than their blunt predecessors. However, there's an obvious problem. How are we going to tell the difference between humans and non-humans if they look like us, sound like us and can engage with us on an emotional level? Not only is this going to be incredibly unsettling, but there's potential for things to get even weirder. If you're not sure what this is referring to, go and watch the film Her. In a world increasingly populated by humanoids, society will need to find a way to draw a line between 'us' and 'them'. It may seem like jumping the gun to envisage a world where the person working alongside you is actually an emotionally intelligent robot. Indeed, the most realistic humanoids are rare, advanced prototypes – but they exist nonetheless, and researchers are working to extend their capabilities.
Whether or not emotionally intelligent technology looks like Nadia Thalmann or a green plastic dinosaur, systems that understand human expression have already disrupted law, retail and hospitality. There's huge scope for emotional recognition in the public sector, from impartial therapists to understanding receptionists. Whilst technology may be able to recognise and respond to certain emotions, it's another question entirely whether it will ever truly comprehend them. Can a robot really be compassionate, or is it simply programmed to react to certain triggers in human behaviour? The answer is almost irrelevant for businesses – the merit in emotionally intelligent technology comes from how it can enhance their strategies. SoftBank Robotics, for instance, will benefit massively from developing models that truly seem to demonstrate empathy. The real question is, will consumers respond well to technology that understands how they're feeling, or is emotional indifference actually one of its greatest strengths?
Could your business benefit from the application of emotionally intelligent software? Will society be comfortable interacting with ‘emotional’ technology? Which other sectors could use emotionally intelligent software to their advantage? Share your thoughts and opinions.