
Characterizing auditory and visual motion processing and integration using submillimeter 7T fMRI: preliminary data

Authors :
UCL - SSH/IPSY - Psychological Sciences Research Institute
UCL - SSS/IONS/COSY - Systems & cognitive Neuroscience
Barilari, Marco
Gau, Remi
Sherif, Syia
Collignon, Olivier
Publication Year :
2022

Abstract

The ability of the brain to integrate motion information originating from separate sensory modalities is fundamental to interacting efficiently with our dynamic environment. The human occipito-temporal visual region hMT+/V5 and the auditory area Planum Temporale (PT) are known to be highly specialized for processing visual and auditory motion directions, respectively. In addition to their role in processing the dominant sensory information, it was recently suggested that these regions may also engage in crossmodal motion processing. How multisensory information is represented in these regions remains, however, poorly understood. To further investigate the multisensory nature of hMT+/V5 and PT, we characterized single-subject activity with ultra-high field (UHF) fMRI (7T) while participants processed horizontal and vertical motion stimuli delivered through vision, audition, or both modalities simultaneously. Our preliminary results confirmed that, in addition to a robust selectivity for visual motion, a portion of hMT+/V5 selectively responds to moving sounds, and a portion of PT responds to moving visual stimuli. We are now further characterizing brain activity across the cortical depths using UHF fMRI combined with vascular space occupancy (VASO) recording at high spatial resolution (0.75 mm isotropic). We hypothesize that hMT+/V5 and PT might encode auditory and visual motion information in separate cortical layers, reflecting the feed-forward versus feed-back nature of how sensory information flows into these regions. This project will shed new light on how crossmodal information is represented across the cortical layers of motion-selective human brain areas.

Details

Database :
OAIster
Notes :
English
Publication Type :
Electronic Resource
Accession number :
edsoai.on1372927983
Document Type :
Electronic Resource