
Skill Transfer for Temporally-Extended Task Specifications

Authors :
Liu, Jason Xinyu
Shah, Ankit
Rosen, Eric
Konidaris, George
Tellex, Stefanie
Publication Year :
2022

Abstract

Deploying robots in real-world domains, such as households and flexible manufacturing lines, requires the robots to be taskable on demand. Linear temporal logic (LTL) is a widely used specification language with a compositional grammar that naturally induces commonalities across tasks. However, most prior research on reinforcement learning with LTL specifications treats every new formula independently. We propose LTL-Transfer, a novel algorithm that enables subpolicy reuse across tasks by segmenting policies for training tasks into portable transition-centric skills capable of satisfying a wide array of unseen LTL specifications while respecting safety-critical constraints. Experiments in a Minecraft-inspired domain show that LTL-Transfer can satisfy over 90% of 500 unseen tasks after training on only 50 task specifications, while never violating a safety constraint. We also deployed LTL-Transfer on a quadruped mobile manipulator in an analog household environment to demonstrate its ability to transfer to many fetch-and-delivery tasks in a zero-shot fashion.
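As a loose illustration of how temporally extended specifications like those above are evaluated incrementally as an agent acts, the sketch below implements standard LTL formula progression (in the style of Bacchus and Kabanza), not the authors' LTL-Transfer algorithm. Formulas are nested tuples over hypothetical proposition names; after each step, the formula is rewritten into what must still hold over the remainder of the trace.

```python
# Minimal LTL progression sketch (assumed representation, not the paper's code).
# A formula is 'True', 'False', a proposition string, or a tuple:
#   ('not', f), ('and', f, g), ('or', f, g), ('next', f), ('until', f, g).
# "Eventually p" is encoded as ('until', 'True', p).

def progress(formula, labels):
    """Rewrite `formula` given the set `labels` of propositions true now.

    Returns the obligation that must hold over the rest of the trace;
    'True' means the specification is already satisfied, 'False' that it
    can no longer be satisfied (e.g. a safety constraint was violated).
    """
    if formula in ('True', 'False'):
        return formula
    if isinstance(formula, str):                     # atomic proposition
        return 'True' if formula in labels else 'False'
    op = formula[0]
    if op == 'not':
        inner = progress(formula[1], labels)
        return {'True': 'False', 'False': 'True'}.get(inner, ('not', inner))
    if op == 'and':
        a, b = progress(formula[1], labels), progress(formula[2], labels)
        if 'False' in (a, b):
            return 'False'
        return b if a == 'True' else a if b == 'True' else ('and', a, b)
    if op == 'or':
        a, b = progress(formula[1], labels), progress(formula[2], labels)
        if 'True' in (a, b):
            return 'True'
        return b if a == 'False' else a if b == 'False' else ('or', a, b)
    if op == 'next':                                 # X f: f holds next step
        return formula[1]
    if op == 'until':                                # f U g == g or (f and X(f U g))
        g = progress(formula[2], labels)
        if g == 'True':
            return 'True'
        f = progress(formula[1], labels)
        if f == 'False':
            return g
        tail = formula if f == 'True' else ('and', f, formula)
        return tail if g == 'False' else ('or', g, tail)
    raise ValueError(f'unknown operator: {op!r}')
```

For example, the task "eventually reach `goal`" stays unchanged while `goal` is unobserved and collapses to `'True'` once it is seen, which is the per-step bookkeeping that transition-centric skill reuse relies on.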

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1333778277
Document Type :
Electronic Resource