
HuggingFace's Transformers: State-of-the-art Natural Language Processing

Authors :
Wolf, Thomas
Debut, Lysandre
Sanh, Victor
Chaumond, Julien
Delangue, Clement
Moi, Anthony
Cistac, Pierric
Rault, Tim
Louf, Rémi
Funtowicz, Morgan
Davison, Joe
Shleifer, Sam
von Platen, Patrick
Ma, Clara
Jernite, Yacine
Plu, Julien
Xu, Canwen
Scao, Teven Le
Gugger, Sylvain
Drame, Mariama
Lhoest, Quentin
Rush, Alexander M.
Publication Year :
2019

Abstract

Recent progress in natural language processing has been driven by advances in both model architecture and model pretraining. Transformer architectures have facilitated building higher-capacity models, and pretraining has made it possible to effectively utilize this capacity for a wide variety of tasks. Transformers is an open-source library with the goal of opening up these advances to the wider machine learning community. The library consists of carefully engineered state-of-the-art Transformer architectures under a unified API. Backing this library is a curated collection of pretrained models made by and available for the community. Transformers is designed to be extensible by researchers, simple for practitioners, and fast and robust in industrial deployments. The library is available at https://github.com/huggingface/transformers

Comment: 8 pages, 4 figures, more details at https://github.com/huggingface/transformers
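
Illustration: the unified API the abstract describes can be seen in the library's Auto* classes, which load any supported architecture from a pretrained checkpoint by name. The sketch below is not from the record itself; it assumes the transformers and torch packages are installed, and the checkpoint name is just one example from the shared model collection.

# Minimal sketch of the unified API: the same Auto* classes work
# across the library's architectures; only the checkpoint id changes.
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

# Example checkpoint (assumption for illustration); any hub model id
# with a sequence-classification head would work the same way.
model_name = "distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

# Tokenize an input sentence and run a forward pass.
inputs = tokenizer("Transformers makes NLP easy.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring logit back to its label string.
print(model.config.id2label[logits.argmax(dim=-1).item()])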

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.1910.03771
Document Type :
Working Paper