
Learning language variations in news corpora through differential embeddings

Authors:
Selmo, Carlos
Martinez, Julian F.
Beiró, Mariano G.
Alvarez-Hamelin, J. Ignacio
Publication Year:
2020

Abstract

There is growing interest in the NLP community in capturing variations in language use, whether through time (i.e., semantic drift), across regions (dialects or variants), or across social contexts (e.g., professional or media technolects). Several successful dynamic embedding models have been proposed that can track semantic change through time. Here we show that a model with a central word representation and a slice-dependent contribution can learn word embeddings from different corpora simultaneously. This model is based on a star-like representation of the slices. We apply it to The New York Times and The Guardian newspapers, and we show that it captures both the temporal dynamics in the yearly slices of each corpus and the language variations between US and UK English in a curated multi-source corpus. We provide an extensive evaluation of this methodology.
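The additive form the abstract describes, a shared central vector per word plus a slice-dependent offset, can be illustrated with a minimal sketch. This is not the paper's implementation: the vocabulary, slice names, dimensions, and initialization scales below are illustrative assumptions, and no training step is shown.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = ["economy", "football", "parliament"]   # toy vocabulary (illustrative)
slices = ["nyt_2015", "guardian_2015"]          # hypothetical corpus slices
dim = 8                                         # toy embedding dimension

# Central representation shared by all slices (the hub of the "star").
central = {w: rng.normal(scale=0.1, size=dim) for w in vocab}
# Slice-dependent contributions (the arms of the star), kept small so that
# each slice's embedding stays close to the shared central vector.
delta = {(w, s): rng.normal(scale=0.01, size=dim)
         for w in vocab for s in slices}

def embed(word, slice_name):
    """Embedding of `word` in a given slice: central vector + slice offset."""
    return central[word] + delta[(word, slice_name)]

# With small offsets, a word's embeddings in two slices remain similar;
# differences between slices are carried entirely by the offsets.
v_us = embed("economy", "nyt_2015")
v_uk = embed("economy", "guardian_2015")
cos = v_us @ v_uk / (np.linalg.norm(v_us) * np.linalg.norm(v_uk))
```

In an actual model of this kind, the central vectors and offsets would be learned jointly (e.g., with a skip-gram-style objective over each slice's corpus), so comparing offsets across slices exposes regional or temporal variation.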

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2011.06949
Document Type:
Working Paper