BARThez: a Skilled Pretrained French Sequence-to-Sequence Model

Authors :
Eddine, Moussa Kamal
Tixier, Antoine J.-P.
Vazirgiannis, Michalis
Publication Year :
2020

Abstract

Inductive transfer learning has taken the entire NLP field by storm, with models such as BERT and BART setting a new state of the art on countless NLU tasks. However, most of the available models and research have focused on English. In this work, we introduce BARThez, the first large-scale pretrained seq2seq model for French. Being based on BART, BARThez is particularly well suited to generative tasks. We evaluate BARThez on five discriminative tasks from the FLUE benchmark and on two generative tasks from OrangeSum, a novel summarization dataset that we created for this research. We show BARThez to be very competitive with state-of-the-art BERT-based French language models such as CamemBERT and FlauBERT. We also continue the pretraining of a multilingual BART on BARThez's corpus, and show that the resulting model, mBARThez, significantly boosts BARThez's generative performance. Code, data and models are publicly available.

Comment: More experiments and results, human evaluation, reorganization of paper
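Since the abstract states that the models are publicly available, the following is a minimal sketch of how such a pretrained seq2seq checkpoint could be used for French abstractive summarization with the Hugging Face transformers library. The checkpoint identifier "moussaKam/barthez" is an assumption made here for illustration; the exact Hub ID should be taken from the authors' release.

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed Hub identifier; verify against the official BARThez release.
model_id = "moussaKam/barthez"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Summarize a French article with beam search decoding.
article = "Texte de l'article à résumer..."
inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=512)
summary_ids = model.generate(**inputs, num_beams=4, max_length=64, early_stopping=True)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))

Note that the base pretrained model would typically need fine-tuning on OrangeSum (or a similar dataset) before producing useful summaries; the call pattern above is the same for a fine-tuned checkpoint.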

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2010.12321
Document Type :
Working Paper