Automatic Selection of Viewpoint for Digital Human Modelling
- Publication Year :
- 2020
Abstract
- During concept design of new vehicles, workplaces, and other complex artifacts, it is critical to assess the positioning of instruments and regulators from the perspective of the end user. One common way to make these kinds of assessments during early product development is through Digital Human Modelling (DHM). DHM tools are able to produce detailed simulations, including vision. Many of these tools comprise evaluations of direct vision, and some are also able to assess other perceptual features. However, to our knowledge, all DHM tools available today require manual selection of the manikin viewpoint. This can be both cumbersome and difficult, and it requires that the DHM user possess detailed knowledge of the visual behavior of workers in the task being modelled. In the present study, we take the first steps towards automatic selection of viewpoint through a computational model of eye-hand coordination. We report descriptive statistics on visual behavior in a pick-and-place task executed in virtual reality. During reaching actions, results reveal a very high degree of eye gaze towards the target object. Participants look at the target object at least once in virtually every trial, even during a repetitive action. The object remains fixated during large proportions of the reaching action, even when participants are forced to move in order to reach it. These results are in line with previous research on eye-hand coordination and suggest that DHM tools should, by default, set the viewpoint to match the manikin's grasping location.
- License: CC BY-NC 4.0
- Funder: This work was financially supported by the synergy Virtual Ergonomics, funded by the Swedish Knowledge Foundation within the INFINIT research environment (KKS Dnr. 20180167). https://www.his.se/sve
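The closing recommendation implies a simple default: orient the manikin's viewpoint along the vector from the eye position to the grasp target. As a rough illustration only (not the authors' implementation; the function name, coordinate convention, and z-up frame are all assumptions), such a default could be sketched as:

```python
import numpy as np

def viewpoint_toward_grasp(eye_pos, grasp_pos):
    """Aim the manikin viewpoint at the grasp target.

    Returns a unit gaze vector plus (yaw, pitch) in degrees.
    Assumes both points share one world frame with z up;
    adapt to the conventions of the DHM tool at hand.
    """
    eye = np.asarray(eye_pos, dtype=float)
    target = np.asarray(grasp_pos, dtype=float)
    gaze = target - eye
    dist = np.linalg.norm(gaze)
    if dist == 0.0:
        raise ValueError("eye and grasp positions coincide")
    gaze /= dist                                    # unit vector from eye to object
    yaw = np.degrees(np.arctan2(gaze[1], gaze[0]))  # heading about the z axis
    pitch = np.degrees(np.arcsin(gaze[2]))          # elevation above the xy plane
    return gaze, yaw, pitch

# Hypothetical example: eyes at standing height, object on a workbench.
direction, yaw, pitch = viewpoint_toward_grasp((0.0, 0.0, 1.65), (0.4, 0.3, 1.0))
print(direction, yaw, pitch)
```

Such a default would match the gaze behavior reported in the abstract, where the grasp target is fixated for large proportions of the reach, while leaving the DHM user free to override the viewpoint for tasks with different visual demands.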
Details
- Database :
- OAIster
- Notes :
- application/pdf, English
- Publication Type :
- Electronic Resource
- Accession number :
- edsoai.on1233569962
- Document Type :
- Electronic Resource
- Full Text :
- https://doi.org/10.3233/ATDE200010