
Zero-Shot Dependency Parsing with Worst-Case Aware Automated Curriculum Learning

Authors:
de Lhoneux, Miryam
Zhang, Sheng
Søgaard, Anders
Publication Year:
2022

Abstract

Large multilingual pretrained language models such as mBERT and XLM-RoBERTa have been found to be surprisingly effective for cross-lingual transfer of syntactic parsing models (Wu and Dredze 2019), but only between related languages. However, when parsing truly low-resource languages, the source and training languages are rarely related. To close this gap, we adopt a method from multi-task learning, which relies on automated curriculum learning, to dynamically optimize for parsing performance on outlier languages. We show that this approach is significantly better than uniform and size-proportional sampling in the zero-shot setting.

Comment: ACL 2022
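The abstract describes the sampling strategy only at a high level. The sketch below is a rough illustration of what a worst-case aware curriculum over training languages could look like: per-language sampling weights are raised in proportion to the loss observed on that language, so probability mass keeps shifting toward the currently worst-performing languages. This is an assumption for intuition, not the paper's implementation; the class name, the step size, and the `train_step`/`batch_from` placeholders are hypothetical.

```python
import math
import random

class WorstCaseAwareSampler:
    """Illustrative sampler that upweights languages with high recent loss."""

    def __init__(self, languages, step_size=0.1):
        self.languages = list(languages)
        self.step_size = step_size
        # Start from uniform log-weights over training languages.
        self.log_weights = {lang: 0.0 for lang in self.languages}

    def distribution(self):
        # Softmax over log-weights gives the current sampling distribution.
        m = max(self.log_weights.values())
        exps = {l: math.exp(w - m) for l, w in self.log_weights.items()}
        total = sum(exps.values())
        return {l: v / total for l, v in exps.items()}

    def sample_language(self):
        # Draw a language according to the current distribution.
        dist = self.distribution()
        r, acc = random.random(), 0.0
        for lang, p in dist.items():
            acc += p
            if r <= acc:
                return lang
        return self.languages[-1]

    def update(self, lang, loss):
        # Exponentiated-gradient style update: a high observed loss raises the
        # language's weight, so worse-performing languages are revisited more.
        self.log_weights[lang] += self.step_size * loss


# Usage sketch (train_step and batch_from stand in for a parser's training loop):
# sampler = WorstCaseAwareSampler(["ar", "fi", "he", "ta", "zh"])
# for _ in range(num_steps):
#     lang = sampler.sample_language()
#     loss = train_step(batch_from(lang))
#     sampler.update(lang, loss)
```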

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2203.08555
Document Type:
Working Paper