
Using Deep Mixture-of-Experts to Detect Word Meaning Shift for TempoWiC

Authors:
Chen, Ze
Wang, Kangxu
Cai, Zijian
Zheng, Jiewen
He, Jiarong
Gao, Max
Zhang, Jason
Publication Year:
2022

Abstract

This paper describes the dma submission to the TempoWiC task, which achieves a macro-F1 score of 77.05% and places first in the task. We first explore the impact of different pre-trained language models. We then apply data cleaning, data augmentation, and adversarial training to improve model generalization and robustness. For further improvement, we integrate POS information and word semantic representations using a Mixture-of-Experts (MoE) approach. The experimental results show that MoE overcomes the feature overuse issue and combines context, POS, and word semantic features effectively. Finally, we use a model ensemble for the final prediction, a method shown to be effective in prior work.
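The MoE approach described in the abstract can be read as a learned gate that weights separate expert networks, one per feature source (context, POS, word semantics), before a shift/no-shift classifier. The following is a minimal PyTorch sketch of such a gated mixture under that reading; the layer sizes, module names, and the binary classification head are illustrative assumptions, not the authors' actual implementation.

```python
import torch
import torch.nn as nn

class FeatureMoE(nn.Module):
    """Sketch of a feature-level Mixture-of-Experts: a softmax gate weights
    three expert projections (context, POS, word-semantic features) and the
    gated mixture feeds a binary classifier (meaning shift vs. no shift).
    All dimensions and names are illustrative, not taken from the paper."""

    def __init__(self, ctx_dim=768, pos_dim=32, word_dim=300, hidden=256):
        super().__init__()
        # One expert per feature source, each mapped to a shared hidden size.
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(ctx_dim, hidden), nn.ReLU()),
            nn.Sequential(nn.Linear(pos_dim, hidden), nn.ReLU()),
            nn.Sequential(nn.Linear(word_dim, hidden), nn.ReLU()),
        ])
        # The gate scores each expert from the concatenated raw features.
        self.gate = nn.Linear(ctx_dim + pos_dim + word_dim, 3)
        self.classifier = nn.Linear(hidden, 2)

    def forward(self, ctx_feat, pos_feat, word_feat):
        outs = torch.stack([
            self.experts[0](ctx_feat),
            self.experts[1](pos_feat),
            self.experts[2](word_feat),
        ], dim=1)                                    # (batch, 3, hidden)
        weights = torch.softmax(
            self.gate(torch.cat([ctx_feat, pos_feat, word_feat], dim=-1)),
            dim=-1,
        )                                            # (batch, 3)
        mixed = (weights.unsqueeze(-1) * outs).sum(dim=1)   # gated mixture
        return self.classifier(mixed)                # logits: {same, shifted}

# Example usage with placeholder feature vectors for a batch of 4 pairs.
model = FeatureMoE()
logits = model(torch.randn(4, 768), torch.randn(4, 32), torch.randn(4, 300))
print(logits.shape)  # torch.Size([4, 2])
```

Because the gate produces per-example weights, each input can lean on a different feature source, which is one way such a mixture can avoid over-relying on any single feature.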

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2211.03466
Document Type:
Working Paper