
The Dark Side of the Language: Pre-trained Transformers in the DarkNet

Authors:
Ranaldi, Leonardo
Nourbakhsh, Aria
Patrizi, Arianna
Ruzzetti, Elena Sofia
Onorati, Dario
Fallucchi, Francesca
Zanzotto, Fabio Massimo
Publication Year:
2022

Abstract

Pre-trained Transformers are challenging human performance in many natural language processing tasks. The gigantic datasets used for pre-training seem to be the key to their success on existing tasks. In this paper, we explore how a range of pre-trained natural language understanding models perform on truly novel and unexplored data, using classification tasks over a DarkNet corpus. Surprisingly, the results show that syntactic and lexical neural networks largely outperform pre-trained Transformers. This suggests that pre-trained Transformers have serious difficulties adapting to radically novel texts.
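To make the comparison concrete, here is a minimal sketch of the kind of experiment the abstract describes: a simple lexical baseline against frozen features from a pre-trained Transformer on a small text classification task. Everything in it is an assumption for illustration, not the paper's actual setup: the toy corpus is invented placeholder text (not DarkNet data), TF-IDF with logistic regression stands in for the paper's lexical and syntactic neural networks, and `bert-base-uncased` stands in for the pre-trained models the authors actually evaluated.

```python
# Sketch only: compares a lexical TF-IDF baseline against frozen
# pre-trained Transformer features on a toy classification task.
# The corpus, labels, model name, and classifier are illustrative
# assumptions, not the paper's data or method.
import torch
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from transformers import AutoModel, AutoTokenizer

# Invented placeholder documents with binary labels (hypothetical data).
texts = [
    "encrypted marketplace listing for digital goods",
    "forum thread discussing privacy software",
    "tutorial on configuring a web server",
    "recipe blog post about baking bread",
    "news article on local sports results",
    "guide to anonymous browsing tools",
    "review of a popular cooking gadget",
    "announcement of a community gardening event",
]
labels = [1, 1, 0, 0, 0, 1, 0, 0]

tr_txt, te_txt, y_tr, y_te = train_test_split(
    texts, labels, test_size=0.25, random_state=0, stratify=labels
)

# Lexical baseline: TF-IDF features plus a linear classifier.
vec = TfidfVectorizer()
lex_clf = LogisticRegression(max_iter=1000).fit(vec.fit_transform(tr_txt), y_tr)
lex_acc = accuracy_score(y_te, lex_clf.predict(vec.transform(te_txt)))

# Pre-trained Transformer: frozen [CLS] embeddings fed to the same
# kind of linear classifier.
tok = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased").eval()

def cls_features(batch):
    enc = tok(batch, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        return bert(**enc).last_hidden_state[:, 0].numpy()

bert_clf = LogisticRegression(max_iter=1000).fit(cls_features(tr_txt), y_tr)
bert_acc = accuracy_score(y_te, bert_clf.predict(cls_features(te_txt)))

print(f"TF-IDF baseline accuracy:   {lex_acc:.2f}")
print(f"Frozen BERT [CLS] accuracy: {bert_acc:.2f}")
```

On genuinely out-of-domain text such as a DarkNet corpus, the abstract's finding is that the lexical route wins; this sketch only shows the shape of such a comparison, not its outcome.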

Details

Language:
English
Database:
OpenAIRE
Accession number:
edsair.doi.dedup.....28bb5d46cca369115c8060d87a897f80