Text classification by untrained sentence embeddings.
- Source :
- Intelligenza Artificiale. 2020, Vol. 14, Issue 2, p245-259. 15p.
- Publication Year :
- 2020
Abstract
- Recurrent Neural Networks (RNNs) represent a natural paradigm for modeling sequential data such as text written in natural language. RNNs and their variations have long been the architecture of choice in many applications; in practice, however, they require elaborate architectures (such as gating mechanisms) and computationally heavy training processes. In this paper we address the question of whether it is possible to generate sentence embeddings via completely untrained recurrent dynamics, on top of which a simple learning algorithm is applied for text classification. This would make it possible to obtain extremely efficient models in terms of training time. Our work investigates the extent to which this approach can be used, analyzing the results on different tasks. Finally, we show that, within certain limits, it is possible to build extremely efficient models for text classification that remain competitive in accuracy with state-of-the-art reference models. [ABSTRACT FROM AUTHOR]
- Subjects :
- *RECURRENT neural networks
*MACHINE learning
*CLASSIFICATION
Details
- Language :
- English
- ISSN :
- 1724-8035
- Volume :
- 14
- Issue :
- 2
- Database :
- Academic Search Index
- Journal :
- Intelligenza Artificiale
- Publication Type :
- Academic Journal
- Accession number :
- 148072888
- Full Text :
- https://doi.org/10.3233/IA-200053