RoBERTurk: Adjusting RoBERTa for Turkish

Authors:
Tas, Nuri
Publication Year:
2024

Abstract

We pretrain RoBERTa on a Turkish corpus using a BPE tokenizer. Our model outperforms the BERTurk family of models on the BOUN dataset for the POS task, underperforms on the IMST dataset for the same task, and achieves competitive scores on the Turkish split of the XTREME dataset for the NER task, all while being pretrained on less data than its competitors. We release our pretrained model and tokenizer.
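The abstract outlines a standard pretraining recipe: train a byte-level BPE tokenizer on raw Turkish text, then pretrain a RoBERTa masked language model from scratch on the tokenized corpus. The sketch below illustrates that recipe with the Hugging Face tokenizers and transformers libraries; the corpus path, vocabulary size, and model dimensions are illustrative assumptions, not the paper's reported settings.

```python
# Minimal sketch of the pipeline described in the abstract, assuming
# a single plain-text Turkish corpus file. Hyperparameters below are
# RoBERTa-base defaults used for illustration, not the paper's values.
from tokenizers import ByteLevelBPETokenizer
from transformers import RobertaConfig, RobertaForMaskedLM

# Train a byte-level BPE tokenizer on the raw corpus (path assumed).
tokenizer = ByteLevelBPETokenizer()
tokenizer.train(
    files=["turkish_corpus.txt"],  # hypothetical corpus file
    vocab_size=50_265,             # RoBERTa-base default, assumed
    min_frequency=2,
    special_tokens=["<s>", "<pad>", "</s>", "<unk>", "<mask>"],
)
tokenizer.save_model("roberturk-tokenizer")

# Configure a RoBERTa model from scratch for masked-LM pretraining.
config = RobertaConfig(
    vocab_size=50_265,
    max_position_embeddings=514,
    hidden_size=768,
    num_hidden_layers=12,
    num_attention_heads=12,
)
model = RobertaForMaskedLM(config)
print(f"Parameters: {model.num_parameters():,}")
```

From here, pretraining would proceed with a standard masked-language-modeling objective (for example, via transformers' Trainer with a DataCollatorForLanguageModeling), followed by fine-tuning on the POS and NER tasks the abstract evaluates.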

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2401.03515
Document Type:
Working Paper