12 September 2012

Computer being developed to read people's lips


A computer is being taught to interpret human emotions based on lip pattern, according to research published in the International Journal of Artificial Intelligence and Soft Computing.

 The system could improve the way we interact with computers and perhaps allow disabled people to use computer-based communications devices, such as voice synthesisers, more effectively and more efficiently.

How it was done

Karthigayan Muthukaruppan of Manipal International University in Selangor, Malaysia, and co-workers have developed a system that uses a genetic algorithm, which refines its solution with each iteration, to match irregular ellipse-fitting equations to the shape of the human mouth as it displays different emotions.

They have used photos of individuals from South-East Asia and Japan to train a computer to recognise the six commonly accepted human emotions - happiness, sadness, fear, anger, disgust and surprise - and a neutral expression. The algorithm analyses the upper and lower lips as two separate ellipses.
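The fitting step can be sketched as follows. This is a minimal illustration, not the team's actual method: a genetic algorithm evolves the parameters of a single ellipse (centre and axes) so that it matches a set of lip-contour points, improving the fit with each generation. The point data, fitness function and algorithm settings below are all assumptions made for the example.

```python
# Illustrative sketch (not the authors' code): a genetic algorithm that
# evolves ellipse parameters (cx, cy, a, b) to fit 2D lip-contour points.
import math
import random

random.seed(42)

def ellipse_error(params, points):
    """Mean deviation of points from the implicit ellipse equation
    ((x-cx)/a)^2 + ((y-cy)/b)^2 = 1."""
    cx, cy, a, b = params
    total = 0.0
    for x, y in points:
        total += abs(((x - cx) / a) ** 2 + ((y - cy) / b) ** 2 - 1.0)
    return total / len(points)

def fit_ellipse_ga(points, pop_size=60, generations=200):
    """Evolve ellipse parameters via selection, crossover and mutation."""
    pop = [[random.uniform(-1, 1), random.uniform(-1, 1),
            random.uniform(0.5, 3), random.uniform(0.2, 2)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: ellipse_error(p, points))
        survivors = pop[:pop_size // 2]          # keep the fittest half
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = random.sample(survivors, 2)
            child = [(u + v) / 2 for u, v in zip(p1, p2)]  # crossover
            child[random.randrange(4)] += random.gauss(0, 0.1)  # mutation
            child[2] = max(child[2], 0.05)       # keep axes positive
            child[3] = max(child[3], 0.05)
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda p: ellipse_error(p, points))

# Synthetic "upper lip" contour: half of an ellipse with a=2, b=0.8.
points = [(2 * math.cos(t), 0.8 * math.sin(t))
          for t in [i * math.pi / 20 for i in range(21)]]
best = fit_ellipse_ga(points)
print(best)
```

With each generation the fittest half of the population survives unchanged, so the best fit can only improve - the "better and better with each iteration" behaviour the researchers describe.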

Interaction between humans and computers

"In recent years, there has been a growing interest in improving all aspects of interaction between humans and computers especially in the area of human emotion recognition by observing facial expression," the team explains. Earlier researchers have developed an understanding that allows emotion to be recreated by manipulating a representation of the human face on a computer screen.

Such research is currently informing the development of more realistic animated actors and even the behaviour of robots. However, the inverse process in which a computer recognises the emotion behind a real human face is still a difficult problem to tackle.

It is well known that many deeper emotions are betrayed by more than movements of the mouth. A genuine smile, for instance, involves the flexing of muscles around the eyes, and eyebrow movements are almost universally essential to the subconscious interpretation of a person's feelings. However, the lips remain a crucial part of the outward expression of emotion.

The team's algorithm can successfully classify the six emotions and the neutral expression described above. The researchers suggest that an initial application of such an emotion detector might be to help disabled patients who lack speech to interact more effectively with computer-based communication devices.
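As a purely illustrative sketch of the classification step: once each lip has been reduced to ellipse parameters, an expression can be labelled by comparing those parameters with stored prototype values. The prototype numbers and the nearest-prototype rule below are invented for this example and are not taken from the paper.

```python
# Hypothetical classification sketch: nearest-prototype matching on
# (upper-lip a, upper-lip b, lower-lip a, lower-lip b) ellipse features.
# All prototype values are invented for illustration.
PROTOTYPES = {
    "happiness": (2.4, 0.5, 2.4, 0.9),
    "sadness":   (1.8, 0.4, 1.8, 0.6),
    "surprise":  (1.5, 0.9, 1.5, 1.3),
    "neutral":   (2.0, 0.4, 2.0, 0.7),
}

def classify(features):
    """Return the label whose prototype has the smallest squared
    Euclidean distance to the measured features."""
    def dist(proto):
        return sum((f - p) ** 2 for f, p in zip(features, proto))
    return min(PROTOTYPES, key=lambda k: dist(PROTOTYPES[k]))

print(classify((2.35, 0.52, 2.41, 0.88)))  # prints "happiness"
```

A real system would learn such prototypes (or a more capable classifier) from the labelled training photos rather than hard-coding them.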

(EurekAlert, September 2012)
