End-to-End Transformer-Based Models in Textual-Based NLP.
- Source :
- AI; Mar2023, Vol. 4 Issue 1, p54-110, 57p
- Publication Year :
- 2023
Abstract
- Transformer architectures are highly expressive because they use self-attention mechanisms to encode long-range dependencies in the input sequences. In this paper, we present a literature review on Transformer-based (TB) models, providing a detailed overview of each model in comparison to the Transformer's standard architecture. This survey focuses on TB models used in the field of Natural Language Processing (NLP) for textual-based tasks. We begin with an overview of the fundamental concepts at the heart of the success of these models. Then, we classify them based on their architecture and training mode. We compare the advantages and disadvantages of popular techniques in terms of architectural design and experimental value. Finally, we discuss open research directions and potential future work to help solve current TB application challenges in NLP. [ABSTRACT FROM AUTHOR]
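- Illustration: the abstract's opening claim refers to the standard scaled dot-product self-attention of the original Transformer, not to any model specific to this survey. Below is a minimal single-head sketch in NumPy; the function name, the toy dimensions, and the random weights are illustrative assumptions, not code from the reviewed paper.

```python
import numpy as np

def scaled_dot_product_self_attention(x, w_q, w_k, w_v):
    """Single-head self-attention over a sequence x of shape (seq_len, d_model).

    Every position attends to every other position, which is how long-range
    dependencies are captured in a single layer.
    """
    q = x @ w_q  # queries, (seq_len, d_k)
    k = x @ w_k  # keys,    (seq_len, d_k)
    v = x @ w_v  # values,  (seq_len, d_v)
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)  # (seq_len, seq_len) pairwise affinities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over key positions
    return weights @ v  # (seq_len, d_v) context vectors

# Toy usage: 5 tokens, model width 8, projection width 4 (illustrative sizes only).
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))
w_q, w_k, w_v = (rng.normal(size=(8, 4)) for _ in range(3))
out = scaled_dot_product_self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (5, 4)
```

- Because the attention weights form a full (seq_len, seq_len) matrix, any token can draw information directly from any other token, unlike recurrent models where distant interactions must pass through many intermediate steps.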
Details
- Language :
- English
- ISSN :
- 2673-2688
- Volume :
- 4
- Issue :
- 1
- Database :
- Complementary Index
- Journal :
- AI
- Publication Type :
- Academic Journal
- Accession number :
- 162724096
- Full Text :
- https://doi.org/10.3390/ai4010004