201. Towards Low-Resource Semi-Supervised Dialogue Generation with Meta-Learning
- Authors
- Shuo Ma, Yi Huang, Xiaoyu Du, Junlan Feng, and Xiaoting Wu
- Subjects
Computer science, Artificial intelligence, Machine learning, Supervised learning, Low resource, Entropy (information theory), Regularization (mathematics)
- Abstract
In this paper, we propose a meta-learning-based semi-supervised explicit dialogue state tracker (SEDST) for neural dialogue generation, denoted as MEDST. Our main motivation is to further bridge the gap between the need for a high-accuracy dialogue state tracker and the common reality that only scarce annotated data are available for most real-life dialogue tasks. Specifically, MEDST has two core steps: meta-training on adequate unlabelled data in an automatic way, and meta-testing on a small amount of annotated data by supervised learning. In particular, we enhance SEDST via entropy regularization, and investigate semi-supervised learning frameworks based on model-agnostic meta-learning (MAML) that reduce the amount of intermediate state labelling required. We find that by leveraging un-annotated data in this meta-learning fashion, the amount of dialogue state annotation can be reduced to below 10% while maintaining equivalent system performance. Experimental results show that MEDST outperforms SEDST substantially, by 18.7% joint goal accuracy and 14.3% entity match rate, on the KVRET corpus with 2% labelled data in semi-supervision.
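The two ingredients named in the abstract, a first-order MAML-style outer update and an entropy regularizer, can be sketched on a toy problem. This is not the authors' implementation: the quadratic per-task loss, the learning rates, and the Shannon form of the entropy term are all illustrative assumptions.

```python
import numpy as np

def entropy_regularizer(probs, eps=1e-12):
    """Shannon entropy of a categorical distribution. A term of this
    form is one plausible way to penalize diffuse predicted state
    distributions on unlabelled dialogues (assumed, not from the paper)."""
    p = np.clip(probs, eps, 1.0)
    return -np.sum(p * np.log(p))

def maml_step(theta, tasks, inner_lr=0.1, outer_lr=0.05):
    """One first-order MAML outer update on a 1-D toy loss
    L_t(theta) = (theta - target_t)^2, one target per task.

    Inner loop: adapt theta separately to each task with one
    gradient step. Outer loop: update theta using the gradient of
    the post-adaptation loss, averaged over tasks."""
    meta_grad = 0.0
    for target in tasks:
        grad = 2.0 * (theta - target)          # inner-loop gradient
        adapted = theta - inner_lr * grad      # task-specific parameters
        meta_grad += 2.0 * (adapted - target)  # first-order outer gradient
    return theta - outer_lr * meta_grad / len(tasks)

# A uniform distribution has maximal entropy (log 2 for two classes),
# so the regularizer pushes predictions away from it.
print(entropy_regularizer(np.array([0.5, 0.5])))  # ~0.693

# Starting from theta = 0 with task targets 1.0 and 3.0, one outer
# step moves theta toward the region where adaptation is cheap.
print(maml_step(0.0, [1.0, 3.0]))
```

In MEDST's terms, meta-training would run many such outer steps over unlabelled dialogue tasks, and meta-testing would correspond to the inner-loop adaptation on the small annotated set.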
- Published
- 2020