
Composition of Sentence Embeddings: Lessons from Statistical Relational Learning

Authors:
Sileo, Damien
Van de Cruys, Tim
Pradel, Camille
Muller, Philippe

Publication Year:
2019

Abstract

Various NLP problems -- such as the prediction of sentence similarity, entailment, and discourse relations -- are all instances of the same general task: the modeling of semantic relations between a pair of textual elements. A popular approach to such problems is to embed sentences into fixed-size vectors and use composition functions (e.g. concatenation or sum) of those vectors as features for the prediction. At the same time, composition of embeddings has been a main focus within the field of Statistical Relational Learning (SRL), whose goal is to predict relations between entities (typically from knowledge base triples). In this article, we show that previous work on relation prediction between texts implicitly uses compositions from baseline SRL models. We show that such compositions are not expressive enough for several tasks (e.g. natural language inference). We build on recent SRL models to address textual relational problems, showing that they are more expressive and can alleviate issues arising from simpler compositions. The resulting models significantly improve the state of the art in both transferable sentence representation learning and relation prediction.

Comment: Camera-ready for *SEM 2019
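As an illustration of the baseline compositions the abstract refers to, here is a minimal NumPy sketch of the common heuristic that builds pair features from two fixed-size sentence embeddings via concatenation, element-wise difference, and element-wise product. The feature set and function name are illustrative assumptions, not taken from the paper itself:

```python
import numpy as np

def compose(u, v):
    """Baseline composition of two sentence embeddings u and v:
    concatenate the vectors themselves with their absolute
    difference and element-wise product, yielding a feature
    vector of size 4 * d for d-dimensional inputs.
    (Illustrative sketch; not the paper's exact feature set.)"""
    return np.concatenate([u, v, np.abs(u - v), u * v])

# Two toy 4-dimensional "sentence embeddings"
u = np.random.rand(4)
v = np.random.rand(4)
features = compose(u, v)  # feature vector of shape (16,)
```

A downstream classifier (e.g. a logistic regression or small MLP) would then predict the relation label from `features`; the paper's point is that such symmetric, low-capacity compositions can be replaced by more expressive SRL-style ones.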

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.1904.02464
Document Type:
Working Paper