Vision-based urban navigation procedures for verbally instructed robots
- Source :
- Robotics and Autonomous Systems. April 30, 2005, Vol. 51 Issue 1, p69, 12 p.
- Publication Year :
- 2005
-
Abstract
- When humans explain a task to be executed by a robot they decompose it into chunks of actions. These form a chain of search-and-act sensory-motor loops that exit when a condition is met. In this paper we investigate the nature of these chunks in an urban visual navigation context, and propose a method for implementing the corresponding robot primitives such as 'take the nth turn right/left'. These primitives make use of a 'short-lived' internal map updated as the robot moves along. The recognition and localisation of intersections is done in the map using task-guided template matching. This approach takes advantage of the content of human instructions to save computation time and improve robustness.
- Authors :
- Theocharis Kyriacou, Guido Bugmann, Stanislao Lauria
- Keywords :
- Urban navigation; Road layout recognition; Robot primitives; Route instructions; Template matching
- Author Affiliation :
- Centre for Interactive Intelligent Systems, School of Computing, Communications and Electronics, University of Plymouth, Drake Circus, Plymouth PL4 8AA, UK
- Article History :
- Received 27 July 2004; Accepted 12 August 2004
- Full Text :
- http://dx.doi.org/10.1016/j.robot.2004.08.011
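The primitive described in the abstract ('take the nth turn right/left' as a search-and-act loop over a short-lived map, with intersections found by task-guided template matching) can be sketched in miniature. This is a hypothetical illustration, not the authors' implementation: the grid encoding, the template, and all names (`RIGHT_TURN_TEMPLATE`, `find_intersections`, `take_nth_turn`) are invented for the sketch.

```python
# Hypothetical sketch of a 'take the nth turn right' robot primitive.
# The short-lived map is a small 2D grid: '.' = road cell, '#' = non-road.
# Because the instruction tells us what to look for, we match only one
# task-relevant template instead of analysing the whole road layout.

RIGHT_TURN_TEMPLATE = [
    "###",
    "...",
    "#.#",
]  # a road running across, with a branch opening to the robot's right


def match_score(grid, template, top, left):
    """Fraction of template cells that agree with the map window."""
    hits = cells = 0
    for r, row in enumerate(template):
        for c, ch in enumerate(row):
            cells += 1
            if grid[top + r][left + c] == ch:
                hits += 1
    return hits / cells


def find_intersections(grid, template, threshold=1.0):
    """Slide the template over the map; return positions scoring >= threshold."""
    th, tw = len(template), len(template[0])
    hits = []
    for top in range(len(grid) - th + 1):
        for left in range(len(grid[0]) - tw + 1):
            if match_score(grid, template, top, left) >= threshold:
                hits.append((top, left))
    return hits


def take_nth_turn(map_snapshots, n, template):
    """Search-and-act loop: scan successive short-lived map snapshots,
    counting template matches, and exit when the nth turn is found."""
    seen = 0
    for step, grid in enumerate(map_snapshots):
        seen += len(find_intersections(grid, template))
        if seen >= n:
            return step  # exit condition met: act (turn) here
    return None  # ran out of map without meeting the condition


if __name__ == "__main__":
    snapshots = [
        ["#######", ".......", "#######"],  # straight road, no branch
        ["#######", ".......", "###.###"],  # one right-hand branch
    ]
    print(take_nth_turn(snapshots, 1, RIGHT_TURN_TEMPLATE))  # prints 1
```

A real system would update the map from vision as the robot moves and score noisy matches below 1.0; the fixed snapshots and exact-match threshold here only stand in for that loop.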
- Subjects :
- Robotics industry
Robots
Computers
Details
- Language :
- English
- ISSN :
- 0921-8890
- Volume :
- 51
- Issue :
- 1
- Database :
- Gale General OneFile
- Journal :
- Robotics and Autonomous Systems
- Publication Type :
- Academic Journal
- Accession number :
- edsgcl.195751351