Hand function, not proximity, biases visuotactile integration later in object processing: An ERP study
- Authors
Catherine L. Reed, Daivik B. Vyas, and John P. Garza
- Subjects
Adult, Male, Adolescent, Experimental and Cognitive Psychology, Motor Activity, Electroencephalography, Visual processing, Young Adult, Arts and Humanities (miscellaneous), Event-related potential, Developmental and Educational Psychology, Humans, Attention, Evoked Potentials, Haptic technology, Cerebral Cortex, Hand function, Multisensory integration, Hand, Event-Related Potentials (P300), Touch Perception, Action (philosophy), Receptive field, Space Perception, Visual Perception, Female, Psychology, Cognitive psychology
- Abstract
Behavioral studies document a functional hand-proximity effect: objects near the palm, but not the back of the hand, affect visual processing. Although visuotactile bimodal neurons integrate visual and haptic inputs, their receptive fields in monkey cortex encompass the whole hand, not just the palm. Using ERPs, we investigated whether hand function influenced the topology of integrated space around the hand. In a visual detection paradigm, target and non-target stimuli appeared equidistant from the hand, either in front of or behind it. Equivalent N1 amplitudes were found for both conditions, whereas P3 target-versus-non-target amplitude differences were greater for palm conditions. Thus, hand proximity biased the processing of visual targets equidistant from the hand early in processing, but hand-function biases emerged later, when targets were selected for potential action. Early hand-proximity effects on object processing depend on sensory-reliant neural responses, whereas later multisensory integration depends more on the hand's functional expertise.
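The abstract's key comparison is between mean ERP amplitudes in an early (N1) and a later (P3) latency window across palm and back-of-hand conditions. The sketch below is purely illustrative and is not the authors' analysis pipeline: the sampling rate, epoch timing, latency windows, array shapes, simulated data, and function names are all assumptions introduced here to show how such window-based amplitude differences might be computed.

```python
# Illustrative sketch (not the authors' pipeline): mean ERP amplitude in
# assumed N1 and P3 windows, and the target-minus-nontarget P3 difference
# compared across palm vs. back-of-hand conditions.
import numpy as np

SFREQ = 500            # assumed sampling rate (Hz)
EPOCH_START = -0.2     # assumed epoch onset relative to stimulus (s)
N1_WINDOW = (0.15, 0.20)   # assumed N1 latency window (s)
P3_WINDOW = (0.30, 0.50)   # assumed P3 latency window (s)

def mean_window_amplitude(epochs, window):
    """Mean amplitude over a latency window.

    epochs: array of shape (n_trials, n_samples) for one channel/ROI,
    baseline-corrected, in microvolts.
    """
    start = int((window[0] - EPOCH_START) * SFREQ)
    stop = int((window[1] - EPOCH_START) * SFREQ)
    return epochs[:, start:stop].mean(axis=1)

def p3_target_effect(target_epochs, nontarget_epochs):
    """Target-minus-nontarget difference in mean P3 amplitude."""
    return (mean_window_amplitude(target_epochs, P3_WINDOW).mean()
            - mean_window_amplitude(nontarget_epochs, P3_WINDOW).mean())

# Example with simulated data: 40 trials x 600 samples per condition
# (600 samples at 500 Hz covers -0.2 to 1.0 s around stimulus onset).
rng = np.random.default_rng(0)
palm_targets, palm_nontargets, back_targets, back_nontargets = (
    rng.normal(0.0, 1.0, size=(40, 600)) for _ in range(4))

# In the study's terms, the P3 finding would correspond to a larger
# target effect for the palm condition than for the back condition.
print("palm P3 effect:", p3_target_effect(palm_targets, palm_nontargets))
print("back P3 effect:", p3_target_effect(back_targets, back_nontargets))
```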
- Published
2019