
Controllable Prompt Tuning For Balancing Group Distributional Robustness

Authors: Phan, Hoang; Wilson, Andrew Gordon; Lei, Qi
Publication Year: 2024

Abstract

Models trained on data composed of different groups or domains can suffer from severe performance degradation under distribution shifts. While recent methods have largely focused on optimizing the worst-group objective, this often comes at the expense of good performance on the other groups. To address this problem, we introduce an optimization scheme that achieves good performance across groups and finds a solution that works for all of them without severely sacrificing performance on any. However, directly applying such optimization involves updating the parameters of the entire network, making it both computationally expensive and challenging. Thus, we introduce Controllable Prompt Tuning (CPT), which couples our approach with prompt-tuning techniques. On spurious correlation benchmarks, our procedure achieves state-of-the-art results across both transformer and non-transformer architectures, as well as unimodal and multimodal data, while tuning only 0.4% of the model parameters.

Comment: Proceedings of the 41st International Conference on Machine Learning
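The paper's group-balancing optimization and its controllability mechanism are not reproduced here, but the prompt-tuning idea the abstract leans on (freeze the pretrained backbone and train only a small bank of prompt embeddings, so that a tiny fraction of parameters is tunable) can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' CPT implementation; the class name, prompt count, and dimensions are assumptions:

```python
import torch
import torch.nn as nn

class PromptTunedEncoder(nn.Module):
    """A frozen backbone with a small bank of learnable prompt
    embeddings prepended to every input sequence; only the prompts
    receive gradients."""

    def __init__(self, backbone: nn.Module, embed_dim: int, num_prompts: int = 8):
        super().__init__()
        self.backbone = backbone
        for p in self.backbone.parameters():
            p.requires_grad = False  # the pretrained network stays frozen
        # These few vectors are the only tunable parameters.
        self.prompts = nn.Parameter(0.02 * torch.randn(num_prompts, embed_dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, embed_dim); backbone must accept batch-first input
        prompts = self.prompts.unsqueeze(0).expand(x.size(0), -1, -1)
        return self.backbone(torch.cat([prompts, x], dim=1))

# Toy usage: a 2-layer transformer backbone with batch-first layout.
layer = nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True)
model = PromptTunedEncoder(nn.TransformerEncoder(layer, num_layers=2), embed_dim=64)
out = model(torch.randn(2, 16, 64))  # -> (2, 24, 64): 8 prompts + 16 tokens
tunable = sum(p.numel() for p in model.parameters() if p.requires_grad)  # 512
```

Because only the prompt bank is trained, the tunable-parameter count is independent of backbone size, which is how a method of this kind can report figures on the order of 0.4% of the full model.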

Details

Database: arXiv
Publication Type: Report
Accession Number: edsarx.2403.02695
Document Type: Working Paper