Segmented space: measuring tactile localisation in body coordinates
- Author
- Laurence R. Harris, Vanessa Harrar, and Lisa M. Pritchett
- Subjects
- Cognitive Neuroscience, Coordinate system, Experimental and Cognitive Psychology, Stimulus (physiology), Somatosensory system, Functional Laterality, Young Adult, Orientation, Body Image, Humans, Male, Female, Computer vision, Visual Cortex, Crossmodal, Neurosciences, Somatosensory Cortex, Torso, Sensory Systems, Ophthalmology, Touch Perception, Touch, Head Movements, Space Perception, Arm, Visual Perception, Computer Vision and Pattern Recognition, Artificial intelligence, Cues, Psychology, Reference frame, Mental image
- Abstract
Previous research showing systematic localisation errors in touch perception related to eye and head position has suggested that touch is at least partially localised in a visual reference frame. However, many previous studies had participants report the location of tactile stimuli relative to a visual probe, which may force coding into a visual reference frame. Also, the visual probe could itself be subject to an effect of eye or head position. Thus, it is necessary to assess the perceived position of a tactile stimulus using a within-modality measure in order to draw definitive conclusions about the coordinate system in which touch might be coded. Here, we present a novel method for measuring the perceived location of a touch in body coordinates: the Segmented Space Method (SSM). In the SSM, participants imagine the region within which the stimulus could be presented as divided into several equally spaced, numbered segments. Participants then simply report the number corresponding to the segment in which they perceived the stimulus. The SSM represents a simple and novel method that can be easily extended to other modalities by dividing any response space into numbered segments centred on some appropriate reference point (e.g. the head, the torso, the hand, or some point in space off the body). Here we apply the SSM to the forearm during eccentric viewing and report localisation errors for touch similar to those previously reported using a crossmodal comparison. The data collected with the SSM strengthen the theory that tactile spatial localisation is generally coded in a visual reference frame even when visual coding is not required by the task.
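As a rough illustration of the geometric idea behind the SSM described above (a response space divided into equally spaced, numbered segments, with the reported number indicating where the touch was felt), the following minimal Python sketch maps a stimulus position along a span to a segment number. All names, units, and values here are illustrative assumptions, not parameters taken from the paper.

```python
def segment_number(position_cm: float, span_start_cm: float,
                   span_end_cm: float, n_segments: int) -> int:
    """Return the 1-based segment of the response space containing position_cm.

    The span from span_start_cm to span_end_cm (e.g. along the forearm,
    measured from some reference point) is divided into n_segments
    equally spaced segments, numbered 1..n_segments.
    """
    if not span_start_cm <= position_cm <= span_end_cm:
        raise ValueError("stimulus position lies outside the response space")
    segment_width = (span_end_cm - span_start_cm) / n_segments
    index = int((position_cm - span_start_cm) // segment_width) + 1
    # A stimulus exactly at the far edge falls into the last segment.
    return min(index, n_segments)


# Hypothetical example: a 30 cm forearm space divided into 6 segments;
# a touch 14 cm from the proximal end falls in segment 3.
print(segment_number(14.0, 0.0, 30.0, 6))
```

In an actual experiment, the comparison of interest would be between the physical segment containing the stimulus and the segment number the participant reports, with any systematic shift (e.g. with eccentric gaze) taken as a localisation error.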
- Published
- 2016