Integrating visual and tactile information in the perirhinal cortex.

Authors :
Holdstock JS
Hocking J
Notley P
Devlin JT
Price CJ
Source :
Cerebral cortex (New York, N.Y. : 1991) [Cereb Cortex] 2009 Dec; Vol. 19 (12), pp. 2993-3000. Date of Electronic Publication: 2009 Apr 22.
Publication Year :
2009

Abstract

By virtue of its widespread afferent projections, perirhinal cortex is thought to bind polymodal information into abstract object-level representations. Consistent with this proposal, deficits in cross-modal integration have been reported after perirhinal lesions in nonhuman primates. It is therefore surprising that imaging studies of humans have not observed perirhinal activation during visual-tactile object matching. Critically, however, these studies did not differentiate between congruent and incongruent trials. This is important because successful integration can only occur when polymodal information indicates a single object (congruent) rather than different objects (incongruent). We scanned neurologically intact individuals using functional magnetic resonance imaging (fMRI) while they matched shapes. We found higher perirhinal activation bilaterally for cross-modal (visual-tactile) than unimodal (visual-visual or tactile-tactile) matching, but only when visual and tactile attributes were congruent. Our results demonstrate that the human perirhinal cortex is involved in cross-modal (visual-tactile) integration and thus indicate a functional homology between human and monkey perirhinal cortices.

Details

Language :
English
ISSN :
1460-2199
Volume :
19
Issue :
12
Database :
MEDLINE
Journal :
Cerebral cortex (New York, N.Y. : 1991)
Publication Type :
Academic Journal
Accession number :
19386635
Full Text :
https://doi.org/10.1093/cercor/bhp073