
GePpeTto Carves Italian into a Language Model

Authors :
De Mattei, L.
Cafagna, M.
Dell'Orletta, F.
Nissim, M.
Guerini, M.
Source :
Scopus-Elsevier
Publication Year :
2021
Publisher :
Accademia University Press, 2021.

Abstract

In the last few years, pre-trained neural architectures have provided impressive improvements across several NLP tasks. Still, generative language models are available mainly for English. We develop GePpeTto, the first generative language model for Italian, built using the GPT-2 architecture. We provide a thorough analysis of GePpeTto's quality by means of both an automatic and a human-based evaluation. The automatic assessment consists of (i) calculating perplexity across different genres and (ii) a profiling analysis of GePpeTto's writing characteristics. We find that GePpeTto's production is a sort of bonsai version of human production: sentences are shorter, yet still complex. Human evaluation is performed on a sentence completion task, where GePpeTto's output is judged as natural more often than not, and much closer to the original human texts than to a simpler language model which we take as a baseline.
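As a rough illustration of the perplexity-based part of the automatic assessment, the sketch below computes perplexity for short texts from different genres under a GPT-2-style causal language model. This is a minimal sketch, not the paper's evaluation code: the checkpoint name and the example sentences are assumptions for illustration, and the paper's actual genre corpora and scoring setup may differ.

```python
import math
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed checkpoint identifier for illustration only; substitute the
# actual Italian GPT-2 checkpoint you want to evaluate.
MODEL_NAME = "LorenzoDeMattei/GePpeTto"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)
model.eval()

def perplexity(text: str) -> float:
    """Perplexity of one text: exp of the mean token-level cross-entropy."""
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        # Passing labels equal to input_ids makes the model return the
        # mean cross-entropy loss over the predicted tokens.
        out = model(**enc, labels=enc["input_ids"])
    return math.exp(out.loss.item())

# Hypothetical genre samples; the paper evaluates on full genre-specific corpora.
samples = {
    "news": "Il governo ha approvato oggi una nuova legge di bilancio.",
    "poetry": "Sempre caro mi fu quest'ermo colle.",
}
for genre, text in samples.items():
    print(f"{genre}: perplexity = {perplexity(text):.2f}")
```

Lower perplexity on a genre indicates that the model finds that kind of text more predictable; comparing scores across genres gives the genre-level picture the abstract refers to.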

Details

Language :
English
Database :
OpenAIRE
Journal :
Scopus-Elsevier
Accession number :
edsair.doi.dedup.....4af63c970d65b6fd7cc7a5db1bc829d4