Profound implications for the advertising business
Researchers at a university in Bangladesh have built an algorithm that can recognise the emotional state of a computer user with reasonable accuracy (up to 87% for some emotions).
The study combined – for the first time – two established ways of detecting user emotions: keystroke dynamics and text-pattern analysis (the latter a long-standing technique in artificial intelligence).
From Science Daily: “Writing in the journal Behaviour & Information Technology, A.F.M. Nazmul Haque Nahin and his colleagues describe how their study combined keystroke dynamics and text-pattern analysis, which are established ways of detecting user emotions.
To provide data for the study, volunteers were asked to note their emotional state after typing passages of fixed text, as well as at regular intervals during their regular (‘free text’) computer use. This provided the researchers with data about keystroke attributes associated with seven emotional states – joy, fear, anger, sadness, disgust, shame and guilt. To help them analyse sample texts, the researchers made use of a standard database of words and sentences associated with the same seven emotional states.
After running a variety of tests, the researchers found that their new ‘combined’ results were better than their separate results. What’s more, the ‘combined’ approach improved performance for five of the seven categories of emotion. Joy (87%) and anger (81%) had the highest rates of accuracy.”
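The ‘combined’ approach described above is a form of classifier fusion: two independent detectors each produce per-emotion scores, and the scores are merged before a final decision is made. Purely as an illustration – the seven emotion labels come from the study, but the probability scores, the `fuse` function and the simple weighted-averaging rule below are invented for this sketch and are not the paper's actual method – a minimal late-fusion step might look like:

```python
# Hypothetical sketch: late fusion of two emotion detectors.
# The seven emotion labels are from the study; the scores and the
# weighted-averaging rule are invented purely for illustration.

EMOTIONS = ["joy", "fear", "anger", "sadness", "disgust", "shame", "guilt"]

def fuse(keystroke_scores, text_scores, weight=0.5):
    """Blend per-emotion probabilities from the two detectors
    and return the most likely emotion label."""
    combined = {
        e: weight * keystroke_scores[e] + (1 - weight) * text_scores[e]
        for e in EMOTIONS
    }
    return max(combined, key=combined.get)

# Example: keystroke dynamics lean towards 'anger', text analysis
# towards 'joy'; the fused score decides between them.
ks = {"joy": 0.30, "fear": 0.05, "anger": 0.40, "sadness": 0.10,
      "disgust": 0.05, "shame": 0.05, "guilt": 0.05}
tx = {"joy": 0.50, "fear": 0.05, "anger": 0.20, "sadness": 0.10,
      "disgust": 0.05, "shame": 0.05, "guilt": 0.05}
print(fuse(ks, tx))  # joy: (0.30 + 0.50) / 2 = 0.40 beats anger's 0.30
```

The intuition is that the two signals fail in different ways – typing rhythm and word choice carry different evidence – so blending them can outperform either alone, which matches the improvement the researchers report for five of the seven emotions.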
This has major positive implications for the advertising industry, though negative ones for TV advertising. If users allowed, say, an iPad camera to monitor their emotions (perhaps via an app designed to make them laugh), an advertiser could display ads only when the user's emotional state is ideal for the message to be received. This is bad news for TV advertising as it stands today, where ads are shown irrespective of content, context and emotional state, making them less effective than ads targeted on all three.