Abstract
The human occipito-temporal region hMT+/V5 is well known for processing visual motion direction. Here, we demonstrate that hMT+/V5 also represents the direction of auditory motion, in a format partially aligned with the one used to code visual motion. We show that auditory and visual motion directions can be reliably decoded in individually localized hMT+/V5 and that motion directions in one modality can be predicted from the activity patterns elicited by the other modality. Despite this shared motion-direction information across the senses, however, vision and audition produce overall opposite voxel-wise responses in hMT+/V5. Our results reveal a multifaceted representation of multisensory motion signals in hMT+/V5 and have broader implications for how we conceive of the division of sensory labor between brain regions dedicated to a specific perceptual function.
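To make the cross-modal decoding logic concrete, here is a minimal sketch of the train-on-one-modality, test-on-the-other analysis the abstract describes. It is illustrative only, not the authors' pipeline: the voxel patterns are synthetic stand-ins (real analyses would use response estimates from an individually localized hMT+/V5 region of interest), and the trial counts, voxel counts, classifier choice, and variable names (`visual`, `auditory`, `proto`) are all assumptions.

```python
# Illustrative cross-modal MVPA decoding with synthetic data.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

n_trials, n_voxels = 80, 200               # hypothetical trial/voxel counts
directions = rng.integers(0, 4, n_trials)  # 4 motion-direction labels

# Simulate a partially shared direction code: each direction has a voxel
# pattern that is attenuated (but preserved) across modalities, plus noise,
# mimicking "partially aligned" representations across the senses.
proto = rng.normal(size=(4, n_voxels))
visual = proto[directions] + 0.8 * rng.normal(size=(n_trials, n_voxels))
auditory = 0.6 * proto[directions] + 0.8 * rng.normal(size=(n_trials, n_voxels))

# Cross-modal decoding: train on visual patterns, test on auditory ones.
clf = make_pipeline(StandardScaler(), LinearSVC())
clf.fit(visual, directions)
acc = clf.score(auditory, directions)
print(f"visual -> auditory decoding accuracy: {acc:.2f} (chance = 0.25)")
```

Above-chance accuracy in this scheme indicates that the classifier's direction boundaries learned from one modality generalize to the other, which is the signature of a shared representational format.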
| Original language | English |
| --- | --- |
| Pages (from-to) | 2289-2299.e8 |
| Journal | Current Biology |
| Volume | 30 |
| Issue number | 12 |
| DOIs | |
| Publication status | Published - 22 Jun 2020 |
Keywords
- auditory
- cross-modal
- decoding
- fMRI
- hMT/V5
- motion
- multimodal
- MVPA
- RSA
- visual