
Lacking the embedding of a word? Look it up into a traditional dictionary

Authors:
Ruzzetti, Elena Sofia
Ranaldi, Leonardo
Mastromattei, Michele
Fallucchi, Francesca
Zanzotto, Fabio Massimo
Source:
Findings of the Association for Computational Linguistics: ACL 2022
Publication Year:
2021

Abstract

Word embeddings are powerful dictionaries, which may easily capture language variations. However, these dictionaries fail to give sense to rare words, which are surprisingly often covered by traditional dictionaries. In this paper, we propose to use definitions retrieved from traditional dictionaries to produce word embeddings for rare words. For this purpose, we introduce two methods: Definition Neural Network (DefiNNet) and Define BERT (DefBERT). In our experiments, DefiNNet and DefBERT significantly outperform state-of-the-art as well as baseline methods devised for producing embeddings of unknown words. In fact, DefiNNet significantly outperforms FastText, which implements a method for the same task based on n-grams, and DefBERT significantly outperforms the BERT method for OOV words. Hence, definitions in traditional dictionaries are useful for building word embeddings for rare words.
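To make the core idea concrete, the following is a minimal sketch of definition-based embedding for an out-of-vocabulary word: look up a dictionary definition and encode it with a pretrained encoder, using the pooled vector as the word's embedding. This is an illustration of the general approach, not the authors' DefiNNet or DefBERT implementation; it assumes WordNet glosses stand in for a "traditional dictionary" and bert-base-uncased stands in for the encoder.

```python
# Illustrative sketch (not the paper's code): embed an OOV word via its
# dictionary definition instead of its surface form.
import nltk
import torch
from nltk.corpus import wordnet as wn
from transformers import AutoTokenizer, AutoModel

nltk.download("wordnet", quiet=True)  # WordNet glosses as a stand-in dictionary

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def definition_embedding(word: str) -> torch.Tensor:
    """Mean-pooled encoder embedding of the word's first WordNet gloss."""
    synsets = wn.synsets(word)
    if not synsets:
        raise ValueError(f"No dictionary definition found for {word!r}")
    gloss = synsets[0].definition()                   # the definition text
    inputs = tokenizer(gloss, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state    # (1, seq_len, 768)
    return hidden.mean(dim=1).squeeze(0)              # (768,) definition vector

# Usage: a rare word gets a vector derived from its definition.
vector = definition_embedding("quokka")
print(vector.shape)  # torch.Size([768])
```

The design choice this illustrates is the one argued in the abstract: when a word is too rare for its own embedding to be reliable, the definition text provides a usable substitute representation.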

Details

Database:
arXiv
Journal:
Findings of the Association for Computational Linguistics: ACL 2022
Publication Type:
Report
Accession Number:
edsarx.2109.11763
Document Type:
Working Paper
Full Text:
https://doi.org/10.18653/v1/2022.findings-acl.208