The emotional computer


A Cambridge University film provides a glimpse of how robots and humans could interact in the future.

Can computers understand emotions? Can computers express emotions? Can they feel emotions? The latest video from the University of Cambridge shows how emotions can be used to improve interaction between humans and computers.

When people talk to each other, they express their feelings through facial expressions, tone of voice and body postures. They even do this when they are interacting with machines. These hidden signals are an important part of human communication, but computers ignore them.

Professor Peter Robinson leads a team in the Computer Laboratory at the University of Cambridge that is exploring the role of emotions in human-computer interaction. His research is examined in the film The Emotional Computer.

"We're building emotionally intelligent computers, ones that can read my mind and know how I feel," Professor Robinson says. "Computers are really good at understanding what someone is typing or even saying. But they need to understand not just what I'm saying, but how I'm saying it."

The research team is collaborating closely with Professor Simon Baron-Cohen's team in the University's Autism Research Centre. Because those researchers study the difficulties that some people have understanding emotions, their insights help to address the same problems in computers.

Facial expressions are an important way of understanding people's feelings. One system tracks features on a person's face, calculates the gestures that are being made and infers emotions from them. It gets the right answer over 70% of the time, which is as good as most human observers.
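The inference step described above can be sketched in miniature. This is not the Cambridge team's system: real trackers locate facial feature points in video and feed them to statistical classifiers, whereas here a few hand-coded gesture intensities and simple threshold rules stand in for that pipeline, and the function name and inputs are illustrative assumptions.

```python
# Toy sketch of gesture-to-emotion inference (illustrative, not the
# actual research system). Inputs are normalised gesture intensities
# in [0.0, 1.0], as a feature tracker might produce them.

def infer_emotion(brow_raise: float, lip_corner_pull: float, lip_press: float) -> str:
    """Map gesture intensities to a coarse emotion label via threshold rules."""
    if lip_corner_pull > 0.5:   # smiling mouth dominates
        return "happy"
    if brow_raise > 0.5:        # raised brows suggest surprise
        return "surprised"
    if lip_press > 0.5:         # pressed lips suggest anger
        return "angry"
    return "neutral"

print(infer_emotion(brow_raise=0.1, lip_corner_pull=0.8, lip_press=0.0))  # happy
```

A real system would replace the thresholds with a classifier trained on labelled examples, which is where accuracy figures like the 70% mentioned above come from.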

Other systems analyse speech intonation to infer emotions from the way that something is said, and analyse body posture and gestures.

Ian Davies, one of the research students in Professor Robinson's team, is looking at applications of these technologies in command and control systems. "Even in something as simple as a car we need to know if the driver is concentrating or confused, so that we can avoid overloading him with distractions from a mobile phone, the radio, or a satellite navigation system."
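The gating idea Davies describes can be sketched as a simple rule: suppress non-essential in-car notifications when the estimated driver load is high. The function, the source list, and the threshold below are hypothetical stand-ins for whatever the real command-and-control logic uses.

```python
# Hypothetical sketch of driver-load gating (not the actual research code).
# Non-essential notification sources are held back when an estimate of the
# driver's cognitive load exceeds a threshold.

DISTRACTION_SOURCES = ("mobile phone", "radio", "satnav")  # assumed set

def allow_notification(source: str, driver_load: float, threshold: float = 0.7) -> bool:
    """Return True if a notification from `source` may interrupt the driver.

    driver_load is an assumed 0.0-1.0 estimate, e.g. from gaze or
    expression analysis; sources outside the known set pass through.
    """
    if source not in DISTRACTION_SOURCES:
        return True  # safety-critical or unknown sources are never blocked here
    return driver_load < threshold

print(allow_notification("radio", driver_load=0.9))  # False: driver is overloaded
```

In practice the load estimate would come from the kinds of facial, vocal and postural signals the article describes, rather than being supplied directly.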

Merely understanding emotions is not enough. Professor Robinson wants computers to express emotions as well, whether they are cartoon animations or physical robots.

PhD student Tadas Baltrušaitis, another team member, works on animating figures to mimic a person's facial expressions, while fellow PhD candidate Laurel Riek is experimenting with a robotic head modelled on Charles Babbage, which appears in the film.

"Charles has two dozen motors controlling 'muscles' in his face, giving him a wide range of expressions," Robinson explains. "We can use him to explore empathy, rapport building, and co-operation in emotional interactions between people and computers."

"The key to scientific research is to avoid preconceptions and to expect surprises. I just recruit the best graduate students from around the world, make sure that they have the resources that they need and then just let them get on with it. They bounce ideas off each other and solve problems together.

"The team has to combine results from many disciplines, and this is true for many research problems in computer science. We need to understand psychology, signal processing and statistical machine learning as well as systems engineering to tackle these problems. Because the University has experts in all these fields it's a perfect place to do the research."