
RoboCodeX: Multimodal Code Generation for Robotic Behavior Synthesis

Authors:
Mu, Yao
Chen, Junting
Zhang, Qinglong
Chen, Shoufa
Yu, Qiaojun
Ge, Chongjian
Chen, Runjian
Liang, Zhixuan
Hu, Mengkang
Tao, Chaofan
Sun, Peize
Yu, Haibao
Yang, Chao
Shao, Wenqi
Wang, Wenhai
Dai, Jifeng
Qiao, Yu
Ding, Mingyu
Luo, Ping
Publication Year: 2024

Abstract

Robotic behavior synthesis, the problem of understanding multimodal inputs and generating precise physical control for robots, is an important part of Embodied AI. Despite successes in applying multimodal large language models for high-level understanding, it remains challenging to translate this conceptual understanding into detailed robotic actions while generalizing across various scenarios. In this paper, we propose a tree-structured multimodal code generation framework for generalized robotic behavior synthesis, termed RoboCodeX. RoboCodeX decomposes high-level human instructions into multiple object-centric manipulation units, each carrying physical preferences such as affordances and safety constraints, and applies code generation to generalize across various robot platforms. To further enhance the ability to map conceptual and perceptual understanding into control commands, a specialized multimodal reasoning dataset is collected for pre-training, and an iterative self-updating methodology is introduced for supervised fine-tuning. Extensive experiments demonstrate that RoboCodeX achieves state-of-the-art performance in both simulators and on real robots across four kinds of manipulation tasks and one navigation task.
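The core idea the abstract describes, decomposing an instruction into a tree of object-centric manipulation units that carry physical preferences (affordances, safety constraints) and then emitting executable robot code, can be sketched in miniature. The following is an illustrative Python sketch only: every name in it (ManipulationUnit, synthesize_code, detect, assert_constraint) is hypothetical and not the actual RoboCodeX implementation or API.

```python
# Hypothetical sketch of a tree-structured decomposition into object-centric
# manipulation units, each carrying affordance and safety preferences, which
# are then rendered as platform-agnostic control code. Names are illustrative,
# not the RoboCodeX API.
from dataclasses import dataclass, field
from typing import List


@dataclass
class ManipulationUnit:
    """One object-centric node in the task tree."""
    target_object: str                       # e.g. "mug"
    action: str                              # e.g. "grasp", "place"
    affordance: str = "any"                  # preferred contact region, e.g. "handle"
    safety_constraints: List[str] = field(default_factory=list)
    children: List["ManipulationUnit"] = field(default_factory=list)


def synthesize_code(unit: ManipulationUnit) -> str:
    """Depth-first traversal of the task tree, emitting code strings;
    a real system would target a specific robot platform's API."""
    lines = [f"# {unit.action} {unit.target_object} via {unit.affordance}"]
    for constraint in unit.safety_constraints:
        lines.append(f"assert_constraint({constraint!r})")
    lines.append(
        f"{unit.action}(detect({unit.target_object!r}), "
        f"affordance={unit.affordance!r})"
    )
    for child in unit.children:
        lines.append(synthesize_code(child))
    return "\n".join(lines)


if __name__ == "__main__":
    # "Put the mug on the shelf" decomposed into two ordered units.
    task = ManipulationUnit(
        target_object="mug",
        action="grasp",
        affordance="handle",
        safety_constraints=["avoid_collision('shelf')"],
        children=[
            ManipulationUnit(target_object="shelf", action="place",
                             affordance="top_surface"),
        ],
    )
    print(synthesize_code(task))
```

Under these assumptions, the tree structure is what gives the approach its portability: the units capture what to manipulate and under which physical preferences, while the final code-emission step can be retargeted to different robot platforms.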

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2402.16117
Document Type: Working Paper