
A framework for interactive responsive animation.

Authors :
Sun, Hanqiu
Green, Mark
Source :
Journal of Visualization & Computer Animation; May 2000, Vol. 11 Issue 2, p83-93, 12p, 1 Black and White Photograph, 3 Diagrams, 1 Chart
Publication Year :
2000

Abstract

The availability of high-performance 3D workstations has increased the range of applications for interactive real-time animation. In these applications the user can directly interact with the objects in the animation and direct the evolution of their motion, rather than simply watching a pre-computed animation sequence. Interactive real-time animation has fast-growing applications in virtual reality, scientific visualization, medical training and distance learning. Traditional approaches to computer animation have been based on the animator having complete control over all aspects of the motion. In interactive animation the user can interact with any of the objects, which changes the current motion path or behaviour in real time. The objects in the animation must be capable of reacting to the user's actions and not simply replaying a canned motion sequence. This paper presents a framework for interactive animation that allows the animator to specify the reactions of objects to events generated by other objects and the user. This framework is based on the concept of relations that describe how an object reacts to the influence of a dynamic environment. Each relation specifies one motion primitive triggered by either its enabling condition or the state of the environment. A collection of relations is structured through several hierarchical layers to produce responsive behaviours and their variations. This framework is illustrated by several room-based dancing examples that are modelled by relations. Copyright © 2000 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
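
The abstract describes relations as rules that fire a motion primitive when an enabling condition over the environment holds, with collections of relations layered into behaviours. The sketch below is not taken from the paper; it is a minimal, hypothetical illustration of that idea, with all names (Relation, Behaviour, EnvState, the dancing example) invented for the example.

```python
# Hypothetical sketch (not the authors' implementation): a "relation" fires a
# motion primitive whenever its enabling condition holds, and relations are
# grouped into hierarchical behaviours that are evaluated once per frame.
from dataclasses import dataclass, field
from typing import Any, Callable, Dict, List

EnvState = Dict[str, Any]                              # snapshot of the dynamic environment
Condition = Callable[[EnvState], bool]                 # enabling condition over the environment
MotionPrimitive = Callable[[EnvState, float], None]    # (state, dt) -> applies one motion step


@dataclass
class Relation:
    """A single reaction rule: run the motion primitive while the condition is enabled."""
    condition: Condition
    primitive: MotionPrimitive

    def update(self, env: EnvState, dt: float) -> None:
        if self.condition(env):
            self.primitive(env, dt)


@dataclass
class Behaviour:
    """A hierarchical layer: a named collection of relations and child behaviours."""
    name: str
    relations: List[Relation] = field(default_factory=list)
    children: List["Behaviour"] = field(default_factory=list)

    def update(self, env: EnvState, dt: float) -> None:
        for relation in self.relations:
            relation.update(env, dt)
        for child in self.children:
            child.update(env, dt)


# Example (invented): a dancer steps toward the user whenever the user is nearby.
def user_nearby(env: EnvState) -> bool:
    return env["user_distance"] < 2.0


def step_toward_user(env: EnvState, dt: float) -> None:
    env["dancer_position"] += 0.5 * dt                 # simplified 1-D motion


dancing = Behaviour("dancing", relations=[Relation(user_nearby, step_toward_user)])
state: EnvState = {"user_distance": 1.5, "dancer_position": 0.0}
dancing.update(state, dt=1.0 / 30.0)                   # one simulation frame at 30 Hz
```

In this reading, user interaction simply mutates the environment state between frames, and the hierarchy of behaviours determines which relations are considered; how the paper actually composes layers and resolves competing primitives is described in the full text.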

Details

Language :
English
ISSN :
1049-8907
Volume :
11
Issue :
2
Database :
Complementary Index
Journal :
Journal of Visualization & Computer Animation
Publication Type :
Academic Journal
Accession number :
13509713
Full Text :
https://doi.org/10.1002/1099-1778(200005)11:2<83::AID-VIS220>3.0.CO;2-C