Computer recognises fake emotions better than humans? Yes, according to a new study!

There has always been a contest between the human brain and rapidly advancing technology. However, in the most recent battle, the computers seem to have emerged victorious.

According to a new study, computers are entering territory that has long been reserved exclusively for humans: the field of emotions. ‘Expression recognition’ software is now said to perform better than humans at discriminating between genuine and fake displays of emotion.

Researchers at the University of California, San Diego, in the US, have developed a computer system that is adept at pattern recognition. When humans were pitted against this system, the computer proved far better at telling genuine emotion from feigned emotion.

Humans, in general, have only about a fifty percent chance of figuring out whether the emotion someone displays is real, and the study bore this out. The computer, by contrast, was accurate 85 percent of the time, because it correctly analyzed facial cues that people missed.

Computers have always excelled at processes built on logic, and in that respect they have outperformed humans. They have usually lagged behind, however, in perceptual tasks that humans find simple. Until now, that is.

The experiment began with volunteers being told to immerse their arm in lukewarm water for a minute and then fool an expert into thinking they were in pain. In the second part of the experiment, the volunteers actually had to immerse their arm in ice-cold water for a minute, but were given no instructions about what to do with their expressions.

A second set of ‘experts’, who were actually other volunteers, had to assess which people from the first group were in genuine discomfort. They registered an accuracy rate of fifty percent, and even after being trained to recognize when someone was in pain, they improved by only five percentage points.

The computer, on the other hand, used a video camera to capture images of each person’s face and decoded them, recognizing the combinations of facial movements that suggested true or faked pain. Its accuracy was a remarkable 85 percent.

It is not surprising that humans performed badly on this task, since even trained physicians cannot reliably pick out genuine expressions. The reason humans failed in the experiment is that we are capable of simulating emotions, at times well enough to deceive other people. This usually happens when social convention dictates that people must feel a certain way at certain times, such as sorrow at a funeral.

In this regard, the computer was better at spotting the subtle differences between voluntary and involuntary facial movements, analyzing twenty facial muscles in every frame of the video. It actually performed better than expected, thanks to its newly programmed ability to pick out tell-tale indicators. Our facial movements are controlled by two systems: one voluntary, which composes deliberate expressions, and the other involuntary. When we want to fake an emotion, the voluntary muscles are directed towards what the face should look like, but subtle differences remain. These cannot be seen by the human eye, yet they are picked up by the computer’s lens. The most indicative feature of all, according to the researchers, was the movement of the mouth.
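For readers curious how such an analysis might be put together, the sketch below is a hypothetical illustration rather than the researchers’ actual software. It assumes that per-frame intensities for twenty facial muscles (action units) have already been extracted from each video by some facial-coding tool; it then summarizes each clip into a single feature vector and trains a simple classifier (Python with scikit-learn) to separate genuine from posed pain. The function names and the placeholder data are invented for illustration only.

    # Minimal sketch (assumed pipeline, not the study's actual software):
    # each video is summarized by per-frame facial muscle/action-unit
    # intensities, and a simple classifier learns to separate genuine
    # from posed pain.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    N_UNITS = 20  # roughly matching the twenty facial muscles tracked per frame

    def summarize_video(frame_intensities: np.ndarray) -> np.ndarray:
        """Collapse a (frames x N_UNITS) array of per-frame intensities
        into one feature vector (mean and variability per unit), so the
        dynamics of each movement, not just its presence, are retained."""
        return np.concatenate([frame_intensities.mean(axis=0),
                               frame_intensities.std(axis=0)])

    # Placeholder data: in a real experiment these arrays would come from
    # a facial-expression coder run on the recorded videos.
    rng = np.random.default_rng(0)
    videos = [rng.random((90, N_UNITS)) for _ in range(60)]  # 60 clips, ~90 frames each
    labels = rng.integers(0, 2, size=60)                     # 1 = genuine pain, 0 = faked

    X = np.vstack([summarize_video(v) for v in videos])
    clf = LogisticRegression(max_iter=1000)
    scores = cross_val_score(clf, X, labels, cv=5)
    print("cross-validated accuracy:", scores.mean())

With random placeholder data the classifier naturally scores around chance; the point of the sketch is only the shape of the pipeline, in which per-frame muscle measurements are turned into features that a model can compare against labelled examples of real and faked expressions.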

The researchers said they hope to develop a widely available and inexpensive version of the system, which would not only help in detecting fraud and assist law enforcement, but also aid in recognizing the emotional states of people who are unable to communicate adequately.