
Hand-Selective Visual Regions Represent How to Grasp 3D Tools: Brain Decoding during Real Actions

Authors :
Diana Tonin
Stephanie Rossit
Courtney Mansfield
Fraser W. Smith
Ethan Knights
Janak Saada
Source :
The Journal of Neuroscience
Publication Year :
2021
Publisher :
Society for Neuroscience, 2021.

Abstract

Most neuroimaging experiments that investigate how tools and their actions are represented in the brain use visual paradigms in which tools or hands are displayed as 2D images and no real movements are performed. These studies discovered selective visual responses in occipito-temporal and parietal cortices for viewing pictures of hands or tools, which are assumed to reflect action processing, but this has rarely been directly investigated. Here, we examined the responses of independently visually defined category-selective brain areas when participants grasped 3D tools. Using real-action fMRI and multi-voxel pattern analysis, we found that grasp typicality representations (i.e., whether a tool is being grasped appropriately for use) were decodable from hand-selective areas in occipito-temporal and parietal cortices, but not from tool-, object-, or body-selective areas, even if partially overlapping. Importantly, these effects were found exclusively for actions with tools and not for biomechanically matched actions with control non-tools. In addition, decoding of grasp typicality was significantly higher in hand-selective than in tool-selective parietal regions. Notably, grasp typicality representations were automatically evoked even when there was no requirement for tool use and participants were naïve to object category (tools vs. non-tools). Finding a specificity for typical tool grasping in hand-selective, rather than tool-selective, regions challenges the long-standing assumption that brain activation for viewing tool images reflects sensorimotor processing linked to tool manipulation. Instead, our results show that typicality representations for tool grasping are automatically evoked in visual regions specialised for representing the human hand, the brain's primary tool for interacting with the world.

Significance Statement

The unique ability of humans to manufacture and use tools is unsurpassed across the animal kingdom, with tool use considered a defining feature of our species. Most neuroscientific studies that investigate the brain mechanisms supporting tool use record brain activity while people simply view images of tools or hands, not while they perform actual hand movements with tools. Here we show that specific areas of the human visual system that preferentially process hands automatically encode how to appropriately grasp 3D tools, even when no actual tool use is required. These findings suggest that visual areas optimized for processing hands represent fundamental aspects of tool grasping in humans, such as the side by which a tool should be grasped for correct manipulation.
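The abstract reports decoding grasp typicality from region-of-interest activity with multi-voxel pattern analysis (MVPA). The sketch below is a minimal, illustrative example of ROI-based decoding with leave-one-run-out cross-validation; the simulated data, ROI, classifier, and parameter choices are placeholders and are not taken from the authors' pipeline.

```python
# Minimal sketch of ROI-based MVPA decoding (illustrative only; hypothetical
# variable names and simulated data, not the study's actual analysis code).
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Simulated data: one voxel pattern per trial from a hand-selective ROI.
n_runs, trials_per_run, n_voxels = 6, 20, 150
n_trials = n_runs * trials_per_run
X = rng.normal(size=(n_trials, n_voxels))             # trial-wise voxel patterns
y = rng.integers(0, 2, size=n_trials)                 # 0 = typical grasp, 1 = atypical grasp
runs = np.repeat(np.arange(n_runs), trials_per_run)   # run labels for cross-validation

# Leave-one-run-out cross-validated linear classifier (a common MVPA choice).
clf = make_pipeline(StandardScaler(), LinearSVC(C=1.0, max_iter=10000))
scores = cross_val_score(clf, X, y, groups=runs, cv=LeaveOneGroupOut())

print(f"Mean decoding accuracy: {scores.mean():.3f} (chance = 0.5)")
```

With random simulated patterns the accuracy hovers around chance; above-chance accuracy in real data would indicate that the ROI's activity patterns carry information about grasp typicality.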

Details

ISSN :
1529-2401 and 0270-6474
Volume :
41
Database :
OpenAIRE
Journal :
The Journal of Neuroscience
Accession number :
edsair.doi.dedup.....b5cfd85e9ede0b4a8fc92ab5550fb41f
Full Text :
https://doi.org/10.1523/jneurosci.0083-21.2021