Our sense of touch can contribute to our ability to perceive
faces, according to new research published in Psychological Science, a journal
of the Association for Psychological Science.
“In daily life, we usually recognise faces through sight and
almost never explore them through touch,” says lead researcher Kazumichi
Matsumiya of Tohoku University in Japan. “But we use information from multiple
sensory modalities in order to perceive many everyday non-face objects and
events, such as speech perception or object recognition. These new findings
suggest that even face processing is essentially multisensory.”
In a series of studies, Matsumiya took advantage of a
phenomenon called the “face aftereffect” to investigate whether our visual
system responds to non-visual signals when processing faces. In the face
aftereffect, we adapt to a face with a particular expression (happiness, for
example), which causes us to perceive a subsequent neutral face as having the
opposite facial expression (i.e., sadness).
Matsumiya hypothesised that if the visual system really does
respond to signals from another modality, then we should see evidence for face
aftereffects transferring from one modality to the other: adaptation to a face
explored by touch should produce visual face aftereffects.
To test this, Matsumiya had participants explore face masks
concealed below a mirror by touching them. After this adaptation period, the
participants were visually presented with a series of faces that had varying
expressions and were asked to classify the faces as happy or sad. The visual
faces and the masks were created from the same exemplar.
In line with his hypothesis, Matsumiya found that
exploring the face masks by touch shifted participants’ perception of the
visually presented faces relative to participants who had no adaptation
period: the visual faces were perceived as having the opposite facial
expression.
Further experiments ruled out other explanations for the
results, including the possibility that the aftereffects emerged because
participants were intentionally imagining visual faces during the adaptation
period. A fourth experiment revealed that the aftereffect also works the
other way: visual stimuli can influence how we perceive a face explored by
touch.
According to Matsumiya, current views on face processing
assume that the visual system receives facial signals only from the visual
modality, but these experiments suggest that face perception is truly
cross-modal.
“These findings suggest that facial information may be coded
in a shared representation between vision and haptics in the brain,” notes
Matsumiya. He adds that the findings may have implications for enhancing
vision and telecommunication technologies, and for the development of aids
for the visually impaired.