
Talk2Car: Taking Control of Your Self-Driving Car

Authors:
Thierry Deruyttere
Marie-Francine Moens
Dusan Grujicic
Luc Van Gool
Simon Vandenhende
Source:
EMNLP/IJCNLP (1)
Publication Year:
2019
Publisher:
arXiv, 2019.

Abstract

A long-term goal of artificial intelligence is to have an agent execute commands communicated through natural language. In many cases the commands are grounded in a visual environment shared by the human who gives the command and the agent. Executing the command then requires mapping it into the physical visual space, after which the appropriate action can be taken. In this paper we consider the former step, specifically in an autonomous driving setting, where a passenger requests an action that can be associated with an object found in a street scene. Our work presents the Talk2Car dataset, the first object referral dataset that contains commands written in natural language for self-driving cars. We provide a detailed comparison with related datasets such as ReferIt, RefCOCO, RefCOCO+, RefCOCOg, Cityscape-Ref and CLEVR-Ref, and include a performance analysis using strong state-of-the-art models. The results show that the proposed object referral task is a challenging one: the models show promising results but still require additional research in natural language processing, computer vision and the intersection of these fields. The dataset can be found on our website: http://macchina-ai.eu/

Comment: 14 pages, accepted at EMNLP-IJCNLP 2019 - Added Talk2Nav reference
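In object referral benchmarks of this kind, each command is paired with one annotated object, and a predicted bounding box is typically scored against the ground truth by intersection-over-union (IoU), with a prediction counted as correct above a fixed threshold (commonly 0.5). A minimal sketch of that metric, assuming an illustrative `(x1, y1, x2, y2)` box format; the function names are hypothetical and not part of the dataset's actual API:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    # Coordinates of the intersection rectangle.
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    # Clamp to zero when the boxes do not overlap.
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def is_correct(pred_box, gold_box, threshold=0.5):
    """A prediction counts as correct when its IoU with the gold box reaches the threshold."""
    return iou(pred_box, gold_box) >= threshold
```

Accuracy on the benchmark is then simply the fraction of commands for which `is_correct` holds for the model's top-ranked box.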

Details

Database:
OpenAIRE
Journal:
EMNLP/IJCNLP (1)
Accession number:
edsair.doi.dedup.....bb6094d8645fbdc3d12b9173e0be7e1f
Full Text:
https://doi.org/10.48550/arxiv.1909.10838