Computers have always been good at doing fast calculations, but adapting to the emotional state of the person using the computer – now there is a grand challenge! The field is called affective computing, and soon it will be an important factor in the way people and computers communicate with each other.
The computer will interpret your body language to determine how you are feeling, then tailor its response accordingly, just as we do with each other. What’s more, we will like it, because it is far more intuitive than the keyboard, mouse and touch screen as input methods.
As a way of communicating, emotion is very old indeed. Long before humans invented spoken language we communicated non-verbally at an emotional level. It is still the principal way that we get information from each other, with around 70% of a message’s content being conveyed by body language, about 20% by tone of voice and only 10% by words.
This is why we are able to recognise people’s emotional state instinctively. Wherever you are in the world, you can tell when a stranger is angry, happy or sad. Indeed, the ability is so ancient that many people can accurately gauge how animals are feeling from their body language.
Affective computing allows humans and computers to go beyond keyboards and use these rich, non-verbal channels of communication to good effect.
How can computers know our emotions?
Computers will read our emotions by much the same process that humans do. It begins by connecting an array of sensors (cameras, microphones, skin conductivity devices) to a computer that gathers varied information about facial expression, posture, gesture, tone of voice and more.
Advanced software then processes the data and, by referencing a database of known patterns, categorises what it is picking up from the sensors. The pattern might match any of a range of emotions, such as anger, disgust, fear, happiness, sadness, surprise, amusement, contempt, contentment, embarrassment, excitement, guilt, pride in an achievement, relief, satisfaction, sensory pleasure or shame.
The system uses a feedback loop to learn and improve. If it is connected to other systems, what one system learns can be learned by all.
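To make that pipeline concrete, here is a deliberately simplified sketch in Python. The feature names, emotion labels and numbers are invented for illustration; a real affective-computing system would fuse many sensor streams and use trained statistical models rather than a hand-built table.

```python
# A minimal sketch of the pattern-matching loop described above, assuming
# sensor readings have already been reduced to numeric features (e.g. smile
# intensity, voice pitch variation, skin conductance). All names and numbers
# here are illustrative, not taken from any real system.
import math

# A tiny "database of known patterns": labelled feature vectors.
# Features: (smile_intensity, voice_pitch_variation, skin_conductance)
known_patterns = [
    ((0.9, 0.6, 0.4), "happy"),
    ((0.1, 0.9, 0.9), "angry"),
    ((0.2, 0.2, 0.3), "sad"),
    ((0.5, 0.8, 0.7), "surprised"),
]

def classify(features):
    """Return the emotion whose stored pattern is nearest to this reading."""
    return min(known_patterns, key=lambda p: math.dist(p[0], features))[1]

def learn(features, correct_label):
    """The feedback loop: store a corrected example so future matches improve."""
    known_patterns.append((features, correct_label))

reading = (0.85, 0.55, 0.35)        # fused sensor reading for one moment
print(classify(reading))             # -> "happy"
learn((0.15, 0.85, 0.95), "afraid")  # a correction every connected system could share
```

The `learn` function stands in for the feedback loop the article describes: each corrected example enlarges the shared database, so what one system learns can be learned by all.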
Here’s where it gets scary for some. With ageing populations and more people living alone, there is rising demand for companions and helpers at home and work to perform tasks that people are reluctant to do.
This need will increasingly be met by anthropomorphic artificial intelligence – functional robots that look and behave like humans. Over time, these will become increasingly life-like, because we like to project human qualities onto the things we live with.
A world in which people live and interact with humanoid robots is still the realm of science fiction, but with so much effort going into their creation, it is not far off.
This theme is explored in the 2015 television series Humans, based on the 2012 Swedish series Äkta Människor (Real Humans), currently broadcast on the ABC.
All kinds of applications
Emotion monitoring is already working in systems that we use every day, and we will see more of it soon. Are you tempted to send an angry email? Your computer will be able to detect your emotion as you type.
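As a toy illustration of that idea (the keywords and thresholds below are invented, not drawn from any real product), a draft-email check might look like this; real systems would use trained language models rather than a keyword list.

```python
# A hypothetical "angry email" check: flag a draft that reads as angry
# before it is sent. Purely illustrative.
ANGRY_WORDS = {"furious", "outraged", "unacceptable", "ridiculous", "incompetent"}

def looks_angry(draft: str, threshold: int = 2) -> bool:
    """Crude heuristic: count hot-button words and exclamation marks."""
    words = draft.lower().split()
    hits = sum(1 for w in words if w.strip(".,!?") in ANGRY_WORDS)
    return hits >= threshold or draft.count("!") >= 3

draft = "This is unacceptable! Your incompetent handling was ridiculous!"
if looks_angry(draft):
    print("You seem angry. Save this as a draft and send it tomorrow?")
```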
At home, knowing your habits and current mood, such a system could adjust the lights, music and room temperature to create the most pleasant ambient conditions. It could suggest entertainment options, and any number of other things.
At school, the mode of online delivery could be optimised based on the mood of the student: bored, interested, frustrated or pleased.
At the doctor’s, it could help with diagnosis and offer people with autism a new way to communicate.
In public, it could notice when people seem likely to do something dangerous, such as smash a window, harass someone or start a fight. Likewise at a sporting event, when a crowd looks like it could turn into a riot. In airports, it could identify people who might be carrying a bomb or smuggling contraband.
In shops, retailers could use it to tell which shoppers need help, who is just browsing and who is thinking about stealing something, all from their body language. Indeed, known shoplifters could be recognised at the door and prevented from entering the store.
In e-commerce, sellers could gauge consumers’ reactions as they read an ad or use a product.
Should I be worried?
Developments like these are likely to worry people concerned about privacy and personal liberty. Those legitimate concerns must be weighed against the greater public interest, with a balance found case by case. Privacy laws will need to be strong and to keep pace with developments.
We should understand that such systems are not conscious in the way humans are. They simply interpret the patterns of behaviour that people show the world and make communicating with computers more intuitive.
They will make mistakes, learn from those mistakes and get better over time, just as humans do. Of course, it may be some time before they can deal with sarcasm and irony. You won’t see too many robots doing stand-up comedy quite yet.
David Tuffley, Lecturer in Applied Ethics and Socio-Technical Studies, Griffith University
This article was originally published on The Conversation. Read the original article.