
Few-Shot Transfer Learning for Text Classification With Lightweight Word Embedding Based Models

Authors :
Gong Jianxing
Jian Huang
Chongyu Pan
Xingsheng Yuan
Source :
IEEE Access, Vol. 7, pp. 53296-53304 (2019)
Publication Year :
2019
Publisher :
Institute of Electrical and Electronics Engineers (IEEE), 2019.

Abstract

Many deep learning architectures have been employed to model the semantic compositionality of text sequences, but they require a huge amount of supervised data for parameter training, making them infeasible when numerous annotated samples are unavailable or simply do not exist. In contrast to such data-hungry deep models, lightweight word embedding-based models can represent text sequences in a plug-and-play way thanks to their parameter-free property. In this paper, a modified hierarchical pooling strategy over pre-trained word embeddings is proposed for text classification in a few-shot transfer learning setting. The model leverages and transfers knowledge obtained from source domains to recognize and classify unseen text sequences with just a handful of support examples in the target problem domain. Extensive experiments on five datasets covering both English and Chinese text demonstrate that simple word embedding-based models (SWEMs) with parameter-free pooling operations are able to abstract and represent the semantics of text. The proposed modified hierarchical pooling method achieves significantly better classification performance in few-shot transfer learning tasks than the alternative methods compared.
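A minimal sketch of the kind of parameter-free pooling the abstract describes, assuming the "modified hierarchical pooling" follows the SWEM-hier idea of local average pooling over a sliding window followed by global max pooling; the function names, the window size, and the nearest-prototype few-shot classifier below are illustrative assumptions, not the authors' implementation.

import numpy as np

def hierarchical_pool(embeddings: np.ndarray, window: int = 3) -> np.ndarray:
    # embeddings: (seq_len, dim) matrix of pre-trained word vectors for one text.
    seq_len, _ = embeddings.shape
    if seq_len <= window:
        # Very short sequences fall back to plain average pooling.
        return embeddings.mean(axis=0)
    # Local average pooling over each sliding window of `window` consecutive words.
    local_means = np.stack([
        embeddings[i:i + window].mean(axis=0)
        for i in range(seq_len - window + 1)
    ])
    # Global max pooling across the window-level averages gives a fixed-length vector.
    return local_means.max(axis=0)

def classify_few_shot(query_vec: np.ndarray, support_vecs_by_class: dict) -> str:
    # Nearest-prototype classification: each class prototype is the mean of the
    # pooled vectors of its few support examples (an illustrative few-shot setup).
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))
    prototypes = {c: np.mean(v, axis=0) for c, v in support_vecs_by_class.items()}
    return max(prototypes, key=lambda c: cosine(query_vec, prototypes[c]))

Because both pooling steps are parameter-free, only the pre-trained embeddings carry knowledge from the source domains; adapting to a new target class requires nothing more than pooling its handful of support examples.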

Details

ISSN :
2169-3536
Volume :
7
Database :
OpenAIRE
Journal :
IEEE Access
Accession number :
edsair.doi.dedup.....76bac5588efacce74da82421b611f21a