1. Behavior and perception in virtual environments
- Author
Chen, Colleen
- Subjects
Cognitive psychology
- Abstract
Standing is a complex process that involves integrating sensory information provided by the visual, proprioceptive, and vestibular systems. These inputs are processed centrally by several areas of the brain, including the cerebellum, brainstem, basal ganglia, and sensorimotor cortex (Andersen & Zipser, 1988), and are used to control the limb and trunk muscles, which receive signals from the spinal cord and peripheral nerves. A set of experiments relying on virtual reality technology was conducted to manipulate visual input and cognitive load in order to better understand the effects of sensory and cognitive influences on the maintenance of upright posture.

In the first study, I designed a series of experiments in a virtual environment to investigate posture responses to visual motion. The results confirmed earlier findings of posture responses to visual motion along the horizontal axes (mediolateral and anteroposterior). We did not find posture responses to visual motion along the vertical axis in the 2D weight-change data. Further investigation of head rotations and movements revealed that posture responses to horizontal and vertical visual motion depend largely on the presence of target fixation.

This dissertation also explores other cognitive factors that affect the maintenance of balance and upright stance. In a second study designed to investigate cognitive influences on postural sway, a classical paradigm from Brooks (1968) was adapted to a virtual environment to measure sway magnitude during spatial and non-spatial tasks. The results did not show conclusive evidence that spatial tasks induced a greater magnitude of postural sway than non-spatial tasks, as some earlier studies had reported (Maylor et al., 2001; Swan et al., 2004). In fact, the data suggest that participants swayed more when they were not engaged in any cognitive task than when they were engaged in either spatial or non-spatial tasks. This result is consistent with an earlier study by Swan and colleagues (2004), which reported improved balance when subjects were performing a secondary cognitive task. This finding may reflect that one's awareness of balance-related cues leads to greater postural sway.

This dissertation also investigates the perceptual limits of gesture-based communication in collaborative environments. In the last study, experiments were implemented in a virtual environment in which the experimenter's hand traced a sequence of three letters while participants attempted to identify those letters. We manipulated the frame rate of the virtual hand to assess how gesture recognition declines as visual motion cues are degraded. I measured how gesture recognition performance varies as a function of frame rate using the method of constant stimuli. I fit psychometric functions to the recognition performance data and found that frame rates between two and three updates per second yield the greatest improvement in performance. I also investigated the word superiority effect (McClelland & Johnston, 1977) by having participants identify gesture sequences that either formed words or non-words. The results confirm that identification performance is higher for words than for non-words (p
- Published
2021
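
The abstract above describes measuring sway magnitude from 2D weight-change (centre-of-pressure) data during the balance studies. The dissertation's actual measures are not specified here; the following is a minimal Python sketch, assuming standard summary statistics (RMS displacement, sway path length, mean sway velocity) computed from a hypothetical force-plate or balance-board recording.

```python
"""Illustrative sketch (not the dissertation's analysis): common postural-sway
summary measures from a 2-D centre-of-pressure (COP) time series, i.e.
mediolateral (ML) and anteroposterior (AP) weight-change data. The specific
measures and the 100 Hz sampling rate are assumptions."""
import numpy as np

def sway_measures(cop_ml, cop_ap, fs=100.0):
    """Return RMS displacement (ML, AP), sway path length, and mean velocity.

    cop_ml, cop_ap : 1-D arrays of COP coordinates (e.g., in cm)
    fs             : sampling rate in Hz (assumed value)
    """
    ml = np.asarray(cop_ml, dtype=float)
    ap = np.asarray(cop_ap, dtype=float)

    # Centre the traces so the measures reflect sway about the mean stance point.
    ml = ml - ml.mean()
    ap = ap - ap.mean()

    rms_ml = np.sqrt(np.mean(ml ** 2))
    rms_ap = np.sqrt(np.mean(ap ** 2))

    # Sway path length: summed Euclidean distance between successive samples.
    path = np.sum(np.sqrt(np.diff(ml) ** 2 + np.diff(ap) ** 2))
    mean_velocity = path / (ml.size / fs)

    return {"rms_ml": rms_ml, "rms_ap": rms_ap,
            "path": path, "mean_velocity": mean_velocity}

if __name__ == "__main__":
    # Synthetic data standing in for one 30 s quiet-stance trial.
    rng = np.random.default_rng(0)
    t = np.arange(0, 30, 1 / 100.0)
    ml = 0.3 * np.sin(2 * np.pi * 0.2 * t) + 0.05 * rng.standard_normal(t.size)
    ap = 0.5 * np.sin(2 * np.pi * 0.1 * t) + 0.05 * rng.standard_normal(t.size)
    print(sway_measures(ml, ap))
```

Larger values of these measures under a given condition would correspond to the greater sway magnitude reported in the second study.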
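For the gesture study, the abstract reports fitting psychometric functions to recognition accuracy collected with the method of constant stimuli across frame rates. The sketch below shows one minimal way to do such a fit, assuming a four-parameter logistic form and scipy.optimize.curve_fit; the frame-rate levels, accuracies, and parameter values are hypothetical and are not the dissertation's data.

```python
"""Illustrative sketch (assumed, not the dissertation's code): fitting a
logistic psychometric function to proportion-correct gesture recognition as a
function of frame rate, as in a method-of-constant-stimuli design."""
import numpy as np
from scipy.optimize import curve_fit

def psychometric(x, alpha, beta, gamma, lam):
    """Logistic psychometric function.

    alpha : frame rate at the midpoint of the rise (updates per second)
    beta  : slope parameter
    gamma : lower asymptote (guess rate)
    lam   : lapse rate (1 minus the upper asymptote)
    """
    return gamma + (1.0 - gamma - lam) / (1.0 + np.exp(-beta * (x - alpha)))

# Hypothetical frame-rate levels (updates/s) and proportions correct.
rates = np.array([0.5, 1.0, 2.0, 3.0, 5.0, 10.0, 30.0])
p_correct = np.array([0.08, 0.15, 0.45, 0.72, 0.85, 0.90, 0.92])

params, _ = curve_fit(
    psychometric, rates, p_correct,
    p0=[2.5, 1.0, 0.05, 0.05],  # initial guesses for alpha, beta, gamma, lam
    bounds=([0.1, 0.01, 0.0, 0.0], [30.0, 10.0, 0.5, 0.2]),
)
alpha, beta, gamma, lam = params
print(f"midpoint ~ {alpha:.2f} updates/s, slope parameter ~ {beta:.2f}")
```

With this parameterization the fitted curve is steepest at the midpoint alpha, i.e. the frame rate at which each additional update per second buys the largest gain in accuracy; the abstract localizes that region between two and three updates per second.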