Emotionally expressive music and dance occur together across the world. This may be because features shared across the senses are represented the same way even in different sensory brain areas, putting music and movement in directly comparable terms. These shared representations may arise from a general need to identify environmentally relevant combinations of sensory features, particularly those that communicate emotion. To test the hypothesis that visual and auditory brain areas share a representational structure, we created music and animation stimuli with crossmodally matched features expressing a range of emotions. Participants confirmed that each emotion corresponded to a set of features shared across music and movement. A subset of participants viewed both music and animation during brain scanning, revealing that representations in auditory and visual brain areas were similar to one another. This shared representation captured not only simple stimulus features but also combinations of features associated with emotion judgments. The posterior superior temporal cortex represented both music and movement using this same structure, suggesting supramodal abstraction of sensory content. Further exploratory analysis revealed that early visual cortex used this shared representational structure even when stimuli were presented auditorily. We propose that crossmodally shared representations support mutually reinforcing dynamics across auditory and visual brain areas, facilitating crossmodal comparison. These shared representations may help explain why emotions are so readily perceived and why some dynamic emotional expressions can generalize across cultural contexts.
This suggests interesting implications for mnemonics, particularly songlines and other Indigenous practices that integrate song, movement, and emotion.
Across the world, people express emotion through music and dance. But why do music and dance go together?
We tested a deceptively simple hypothesis: Music and movement are represented the same way in the brain.
— Beau Sievers (@beausievers) October 12, 2021
ᔥ “New work published today in Current Biology: Visual and auditory brain areas share a representational structure that supports emotion perception. With @ThaliaWheatley @k_v_n_l @parkinsoncm @sergeyfogelson (thread after coffee!) https://t.co/AURqH9kNLb https://t.co/ro4o4oEwk5” / Twitter