Analyzing how context size and symmetry influence word embedding information
- Publication Year: 2022
Abstract
- Master's thesis in Theoretical and Applied Linguistics. Supervisor: Dr. Thomas Brochhagen. Word embeddings represent word meaning in the form of a vector; however, the information they encode varies with the parameters they are trained with. This paper analyzes how two parameters, context size and symmetry, influence the information captured by word embeddings, and asks whether a single distributional parametrization can capture both semantic similarity and relatedness. Models were trained with GloVe under different parametrizations and then evaluated quantitatively on a similarity task, using WordSim-353 (for relatedness) and SimLex-999 (for semantic similarity) as benchmarks. The results show minimal variation when manipulating some of the analyzed parameters, in particular between symmetric and asymmetric contexts, which leads us to conclude that training models with large contexts is not necessary for achieving good performance.
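The evaluation described in the abstract follows the standard protocol for intrinsic similarity benchmarks: compute cosine similarity between the embeddings of each benchmark word pair and correlate those scores with the human ratings using Spearman's rho. The sketch below illustrates that protocol only; it is not taken from the thesis, and the file names, paths, and input formats are assumptions.

```python
# Minimal sketch of a similarity-task evaluation for GloVe vectors,
# assuming the standard GloVe text format ("word v1 v2 ...") and a
# benchmark given as (word1, word2, human_score) triples.
# File names below are hypothetical.
import numpy as np
from scipy.stats import spearmanr


def load_glove(path):
    """Load GloVe vectors from the standard whitespace-separated text format."""
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            vectors[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
    return vectors


def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))


def evaluate(vectors, pairs):
    """Spearman's rho between model similarities and human ratings,
    computed over the pairs whose words are both in the vocabulary."""
    model_scores, human_scores = [], []
    for w1, w2, gold in pairs:
        if w1 in vectors and w2 in vectors:
            model_scores.append(cosine(vectors[w1], vectors[w2]))
            human_scores.append(gold)
    rho, _ = spearmanr(model_scores, human_scores)
    return rho


# Hypothetical usage: one model per parametrization (context size, symmetry),
# each scored against SimLex-999 (similarity) and WordSim-353 (relatedness).
# vectors = load_glove("glove.window5.symmetric.txt")
# pairs = [("old", "new", 1.58), ("smart", "intelligent", 9.2)]  # example triples
# print(evaluate(vectors, pairs))
```

Comparing the resulting correlations across parametrizations (e.g. symmetric vs. asymmetric contexts, small vs. large windows) is what allows the kind of conclusion the abstract draws about context size.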
Details
- Language: English
- Database: OpenAIRE
- Accession number: edsair.od......1610..b9eb15262ed2d1f34247a4cac36031a6