When people talk, sing, or dance, they often tilt their heads in ways that emphasize the message they want to convey. Although facial expressions and voice convey emotion far more obviously, a recent study suggests that head movements alone may be enough.
Body language is a form of nonverbal communication that relays information without spoken words. With the right tilt of the head, a person can express different feelings, whether happy or sad, and observers can judge those emotions accurately even without hearing a sound or seeing a facial expression.
Researchers from McGill University in Montreal agree. They found that head movements go a long way toward helping people perceive what others are feeling.
Steven R. Livingstone and Caroline Palmer of McGill's Department of Psychology examined how the emotions vocalists expressed influenced their head movements, and how other people perceived those gestures. Their team used motion-capture equipment to track vocalists' head movements in three dimensions while they spoke or sang with emotion. The feelings were classified as very happy, happy, neutral, sad, or very sad. Video clips of the head movements were then shown to viewers, who were asked to identify the vocalists' emotions without hearing the audio or seeing facial expressions.
The researchers found that when people talked, the ways they moved their heads revealed the emotions they were expressing. Viewers were remarkably accurate at identifying a speaker's emotion from head movements alone.
"While the head movements for happy and sad emotions differed, they were highly similar across speech and song, despite differences in vocal acoustics," says Livingstone, also a postdoctoral fellow at McMaster University, in a press release. "Although the research was based on North American English speakers, the focus on head movements creates the possibility for studying emotional communication in contexts where different languages are spoken."
The researchers believe the findings could lead to new applications, such as social robotics and automated emotion recognition, both useful in situations where audio is unavailable.
The study appears in the journal Emotion.