An Adaptive Sentence Representation Learning Model Based on Multi-gram CNN

Authors :
Chunyun Zhang
Baolin Zhao
Lu Yang
Chaoran Cui
Xi Xiaoming
Yin Yilong
Sheng Gao
Source :
Intelligent Environments
Publication Year :
2017
Publisher :
IEEE, 2017.

Abstract

Natural Language Processing has received increasing attention recently. Traditional approaches to language modeling rely primarily on elaborately designed features and complicated natural language processing tools, which require a large amount of human effort and are prone to error propagation and data sparsity problems. Deep neural network methods have been shown to learn the implicit semantics of text without extra knowledge. To better learn the deep underlying semantics of sentences, most deep neural network language models utilize a multi-gram strategy. However, current multi-gram strategies in CNN frameworks are mostly realized by concatenating trained multi-gram vectors to form the sentence vector, which increases the number of parameters to be learned and is prone to overfitting. To alleviate these problems, we propose a novel adaptive sentence representation learning model based on a multi-gram CNN framework. It learns adaptive importance weights for different n-gram features and forms the sentence representation by a weighted sum over the extracted n-gram features, which largely reduces the number of parameters to be learned and alleviates the threat of overfitting. Experimental results show that the proposed method improves performance when used in sentiment and relation classification tasks.
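
The combination scheme described in the abstract (learned, softmax-normalised importance weights and a weighted sum over fixed-size n-gram feature vectors, rather than concatenation) could be sketched roughly as follows. This is a minimal PyTorch illustration under assumed settings; the class name AdaptiveMultiGramCNN and all hyperparameters (embed_dim, num_filters, ngram_sizes, num_classes) are hypothetical and not drawn from the paper.

    # Sketch only: parallel CNN branches extract n-gram features, which are
    # combined by a learned, softmax-normalised weighted sum instead of
    # concatenation, keeping the sentence vector at a fixed dimension.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class AdaptiveMultiGramCNN(nn.Module):
        def __init__(self, vocab_size, embed_dim=100, num_filters=128,
                     ngram_sizes=(2, 3, 4), num_classes=2):
            super().__init__()
            self.embedding = nn.Embedding(vocab_size, embed_dim)
            # One convolution branch per n-gram size; each branch yields a
            # num_filters-dimensional feature vector after max-pooling.
            self.convs = nn.ModuleList(
                nn.Conv1d(embed_dim, num_filters, kernel_size=n, padding=n - 1)
                for n in ngram_sizes
            )
            # Adaptive importance weights, one scalar per n-gram branch.
            self.ngram_logits = nn.Parameter(torch.zeros(len(ngram_sizes)))
            self.classifier = nn.Linear(num_filters, num_classes)

        def forward(self, token_ids):                      # (batch, seq_len)
            x = self.embedding(token_ids).transpose(1, 2)  # (batch, embed_dim, seq_len)
            # Max-pool each branch over time to a fixed-size feature vector.
            feats = [F.relu(conv(x)).max(dim=2).values for conv in self.convs]
            feats = torch.stack(feats, dim=1)              # (batch, n_grams, num_filters)
            # Weighted sum with softmax-normalised importance weights; no
            # concatenation, so the sentence vector stays num_filters wide.
            weights = F.softmax(self.ngram_logits, dim=0).view(1, -1, 1)
            sentence_vec = (weights * feats).sum(dim=1)    # (batch, num_filters)
            return self.classifier(sentence_vec)

    # Usage example with a hypothetical vocabulary and random token ids.
    model = AdaptiveMultiGramCNN(vocab_size=10000)
    logits = model(torch.randint(0, 10000, (2, 20)))
    print(logits.shape)  # torch.Size([2, 2])

The key design point the abstract emphasises is visible in the weighted-sum step: the sentence vector keeps the dimensionality of a single branch, so the downstream classifier needs far fewer parameters than it would with concatenated multi-gram vectors.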

Details

Database :
OpenAIRE
Journal :
2017 International Conference on Intelligent Environments (IE)
Accession number :
edsair.doi...........8caf2b9ab2775464721cef256eed4e34
Full Text :
https://doi.org/10.1109/ie.2017.18