
Task-Aware Harmony Multi-Task Decision Transformer for Offline Reinforcement Learning

Authors :
Fan, Ziqing
Hu, Shengchao
Zhou, Yuhang
Shen, Li
Zhang, Ya
Wang, Yanfeng
Tao, Dacheng
Publication Year :
2024

Abstract

The purpose of offline multi-task reinforcement learning (MTRL) is to develop a unified policy applicable to diverse tasks without the need for online environmental interaction. Recent advancements approach this through sequence modeling, leveraging the Transformer architecture's scalability and the benefits of parameter sharing to exploit task similarities. However, variations in task content and complexity pose significant challenges in policy formulation, necessitating judicious parameter sharing and management of conflicting gradients for optimal policy performance. Furthermore, identifying the optimal parameter subspace for each task often requires prior knowledge of the task identifier during inference, limiting applicability in real-world scenarios where task content varies and the current task is unknown. In this work, we introduce the Harmony Multi-Task Decision Transformer (HarmoDT), a novel solution designed to identify an optimal harmony subspace of parameters for each task. We formulate this as a bi-level optimization problem within a meta-learning framework, where the upper level learns masks that define the harmony subspace, while the inner level updates parameters to improve the overall performance of the unified policy. To eliminate the need for task identifiers, we further design a group-wise variant (G-HarmoDT) that clusters tasks into coherent groups based on gradient information and uses a gating network to infer task identifiers during inference. Empirical evaluations across various benchmarks highlight the superiority of our approach in the multi-task context, with improvements of 8% in task-provided settings, 5% in task-agnostic settings, and 10% in unseen settings.

Comment: Extension of corresponding ICML edition arXiv:2405.18080. arXiv admin note: substantial text overlap with arXiv:2405.18080
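The bi-level scheme described in the abstract (an upper level learning per-task masks over shared parameters, an inner level updating the parameters through those masks) can be illustrated with a minimal toy sketch. This is not the authors' algorithm: the quadratic task losses, the top-k hard mask, and the sign-based score update driven by gradient agreement are all simplifying assumptions chosen only to make the structure of the two levels concrete.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: one shared parameter vector and two tasks with quadratic
# losses 0.5 * ||theta - target||^2. Dimensions and targets are hypothetical.
D = 8
theta = rng.normal(size=D)
targets = [rng.normal(size=D) for _ in range(2)]
initial_losses = [0.5 * np.sum((theta - t) ** 2) for t in targets]

def task_grad(theta, target):
    """Gradient of the quadratic loss for one task."""
    return theta - target

# Upper level: real-valued mask scores per task; a hard mask keeps the
# top-k scored coordinates (that task's "harmony subspace").
scores = [np.zeros(D) for _ in targets]
k = 4

def hard_mask(score):
    mask = np.zeros_like(score)
    mask[np.argsort(score)[-k:]] = 1.0
    return mask

lr, mask_lr = 0.1, 0.05
for step in range(200):
    grads = [task_grad(theta, t) for t in targets]
    avg_grad = np.mean(grads, axis=0)
    # Upper level: raise a coordinate's score where the task gradient
    # agrees in sign with the average gradient (low conflict), lower it
    # where they point in opposite directions.
    for s, g in zip(scores, grads):
        s += mask_lr * np.sign(g * avg_grad)
    # Inner level: update the shared parameters using only the masked
    # (harmonious) coordinates of each task's gradient.
    update = np.zeros(D)
    for s, g in zip(scores, grads):
        update += hard_mask(s) * g
    theta -= lr * update / len(targets)

final_losses = [0.5 * np.sum((theta - t) ** 2) for t in targets]
```

Coordinates excluded from every mask stay frozen, while masked coordinates move toward a compromise between the task optima, so the combined loss decreases even when the two tasks' gradients conflict on some coordinates.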

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2411.01146
Document Type :
Working Paper