Shared Representation of Visual and Auditory Motion Directions in the Human Middle-Temporal Cortex

Mohamed Rezk*, Stephanie Cattoir, Ceren Battal, Valeria Occelli, Stefania Mattioni, Olivier Collignon

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

23 Citations (Scopus)

Abstract

The human occipito-temporal region hMT+/V5 is well known for processing visual motion direction. Here, we demonstrate that hMT+/V5 also represents the direction of auditory motion in a format partially aligned with the one used to code visual motion. We show that auditory and visual motion directions can be reliably decoded in individually localized hMT+/V5 and that motion directions in one modality can be predicted from the activity patterns elicited by the other modality. Despite this shared motion-direction information across the senses, however, vision and audition produce overall opposite voxel-wise responses in hMT+/V5. Our results reveal a multifaceted representation of multisensory motion signals in hMT+/V5 and have broader implications for how we understand the division of sensory labor between brain regions dedicated to specific perceptual functions.
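The cross-modal decoding result described in the abstract follows the logic of multivoxel pattern analysis (MVPA): a classifier is trained on voxel activity patterns evoked by motion in one modality and tested on patterns evoked by the other. The sketch below illustrates this logic on synthetic data with scikit-learn; the trial and voxel counts, noise levels, and classifier choice are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal sketch of cross-modal MVPA decoding: train on visual motion
# patterns, test on auditory motion patterns. Data are synthetic; the
# real analysis would use response estimates from individually
# localized hMT+/V5 voxels.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_voxels = 80, 120                 # hypothetical counts
directions = rng.integers(0, 4, size=n_trials)  # 4 motion directions

# Synthetic voxel patterns: a direction code shared across modalities
# plus modality-specific noise, mimicking a partially aligned format.
shared_code = rng.normal(size=(4, n_voxels))
visual = shared_code[directions] + rng.normal(scale=2.0, size=(n_trials, n_voxels))
auditory = shared_code[directions] + rng.normal(scale=2.0, size=(n_trials, n_voxels))

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
clf.fit(visual, directions)                  # train on visual patterns
acc = clf.score(auditory, directions)        # test on auditory patterns
print(f"cross-modal decoding accuracy: {acc:.2f} (chance = 0.25)")
```

Above-chance generalization from vision to audition in this scheme is what licenses the claim of a partially shared representational format, since the classifier can only transfer if the two modalities encode direction in overlapping voxel patterns.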

Original language: English
Pages (from-to): 2289-2299.e8
Journal: Current Biology
Volume: 30
Issue number: 12
DOIs
Publication status: Published - 22 Jun 2020

Keywords

  • auditory
  • cross-modal
  • decoding
  • fMRI
  • hMT/V5
  • motion
  • multimodal
  • MVPA
  • RSA
  • visual
