Bookmarked Visual and auditory brain areas share a representational structure that supports emotion perception by Beau Sievers, Carolyn Parkinson, Peter J. Kohler, James M. Hughes, Sergey V. Fogelson, Thalia Wheatley (Current Biology)
Emotionally expressive music and dance occur together across the world. This may be because features shared across the senses are represented the same way even in different sensory brain areas, putting music and movement in directly comparable terms. These shared representations may arise from a general need to identify environmentally relevant combinations of sensory features, particularly those that communicate emotion. To test the hypothesis that visual and auditory brain areas share a representational structure, we created music and animation stimuli with crossmodally matched features expressing a range of emotions. Participants confirmed that each emotion corresponded to a set of features shared across music and movement. A subset of participants viewed both music and animation during brain scanning, revealing that representations in auditory and visual brain areas were similar to one another. This shared representation captured not only simple stimulus features but also combinations of features associated with emotion judgments. The posterior superior temporal cortex represented both music and movement using this same structure, suggesting supramodal abstraction of sensory content. Further exploratory analysis revealed that early visual cortex used this shared representational structure even when stimuli were presented auditorily. We propose that crossmodally shared representations support mutually reinforcing dynamics across auditory and visual brain areas, facilitating crossmodal comparison. These shared representations may help explain why emotions are so readily perceived and why some dynamic emotional expressions can generalize across cultural contexts.
This suggests some interesting implications for mnemonics, and particularly for songlines and other indigenous peoples' practices that integrate song, movement, and emotion.

Preprint: https://www.biorxiv.org/content/10.1101/254961v4

Beau Sievers on Twitter: "New work published today in Current Biology: Visual and auditory brain areas share a representational structure that supports emotion perception. With @ThaliaWheatley @k_v_n_l @parkinsoncm @sergeyfogelson (thread after coffee!) https://t.co/AURqH9kNLb https://t.co/ro4o4oEwk5"

Published by

Chris Aldrich

