1. Learning to Walk by Imitation in Low-Dimensional Subspaces
- Author
Rajesh P. N. Rao, Rawichote Chalodhorn, Keith Grochow, and David B. Grimes
- Subjects
business.industry ,Computer science ,Dimensionality reduction ,media_common.quotation_subject ,Kinematics ,Motion capture ,Motion (physics) ,Computer Science Applications ,Computer Science::Robotics ,Human-Computer Interaction ,Complex dynamics ,Hardware and Architecture ,Control and Systems Engineering ,Robot ,Computer vision ,Artificial intelligence ,Imitation ,business ,Software ,Humanoid robot ,ComputingMethodologies_COMPUTERGRAPHICS ,media_common - Abstract
In this paper, we provide the first demonstration that a humanoid robot can learn to walk directly by imitating a human gait obtained from motion capture (mocap) data, without any prior knowledge of its dynamics model. Programming a humanoid robot to perform an action (such as walking) that takes into account the robot's complex dynamics is a challenging problem. Traditional approaches typically require highly accurate prior knowledge of the robot's dynamics and environment in order to devise complex (and often brittle) control algorithms for generating a stable dynamic motion. Training using human mocap is an intuitive and flexible approach to programming a robot, but direct use of mocap data usually results in dynamically unstable motion. Furthermore, optimization using high-dimensional mocap data in the humanoid's full-body joint space is typically intractable. We propose a new approach to tractable imitation-based learning in humanoids that does not require a dynamics model of the robot. We represent kinematic information...
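The abstract's core idea, projecting high-dimensional full-body joint-angle trajectories onto a low-dimensional subspace where optimization becomes tractable, can be illustrated with a linear PCA embedding. The sketch below is illustrative only: the joint count, subspace dimension, and synthetic gait data are assumptions, and the paper's actual embedding method may differ.

```python
import numpy as np

def reduce_postures(joint_angles, n_dims=3):
    """Project high-dimensional joint-angle postures (frames x joints)
    onto a low-dimensional linear subspace via PCA, and back-project
    to joint space. Illustrative sketch of subspace representation."""
    mean = joint_angles.mean(axis=0)
    centered = joint_angles - mean
    # SVD of the centered data yields the principal directions.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:n_dims]                    # (n_dims, n_joints)
    latent = centered @ basis.T            # low-dimensional trajectory
    reconstructed = latent @ basis + mean  # back-projection to joint space
    return latent, reconstructed

# Synthetic "gait": 100 frames of a hypothetical 20-joint humanoid whose
# motion actually lies in a 3-D subspace, plus small sensor noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 100)
phases = np.stack([np.sin(t), np.cos(t), np.sin(2 * t)], axis=1)
mixing = rng.normal(size=(3, 20))
data = phases @ mixing + 0.01 * rng.normal(size=(100, 20))

latent, recon = reduce_postures(data, n_dims=3)
max_err = float(np.abs(recon - data).max())
print(latent.shape, max_err)
```

Because the periodic gait signal is nearly 3-dimensional, the reconstruction error stays on the order of the injected noise, which is why optimizing in the latent space can recover dynamically meaningful full-body motion.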
- Published
- 2010