Text clustering based on pre-trained models and autoencoders.
- Source :
- Frontiers in Computational Neuroscience; 2024, p01-13, 13p
- Publication Year :
- 2024
Abstract
- Text clustering is the task of grouping text data by similarity, and it holds particular importance in the medical field. In healthcare, medical data clustering is a highly active and effective research area: it not only provides strong support for making correct medical decisions from medical datasets but also aids patient record management and medical information retrieval. As the healthcare industry develops, a large amount of medical data is being generated, and traditional medical data clustering faces significant challenges. Many existing text clustering algorithms are based primarily on the bag-of-words model, which suffers from high dimensionality, sparsity, and the neglect of word position and context. Pre-trained models are a deep learning-based approach that treats text as a sequence, accurately capturing word positions and contextual information. Moreover, compared with traditional K-means and fuzzy C-means clustering models, deep learning-based clustering algorithms are better at handling high-dimensional, complex, and nonlinear data. In particular, clustering algorithms based on autoencoders can learn data representations and clustering information jointly, effectively reducing noise interference and errors during clustering. This paper combines pre-trained language models with deep embedding clustering models. Experimental results demonstrate that our model performs exceptionally well on four public datasets, outperforming most existing text clustering algorithms, and can be applied to medical data clustering. [ABSTRACT FROM AUTHOR]
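The abstract describes a two-stage idea: encode texts with a pre-trained language model so word order and context are preserved, then learn a compact latent representation with an autoencoder and cluster in that latent space. The sketch below illustrates that pipeline under stated assumptions; the embedding model name (all-MiniLM-L6-v2), the layer sizes, and the plain K-means step standing in for the paper's deep-embedding-clustering refinement are illustrative choices, not the authors' exact configuration.

```python
# Minimal sketch of the pipeline outlined in the abstract: pre-trained embeddings
# -> autoencoder latent codes -> clustering. Model choice, dimensions, and the
# K-means step are assumptions for illustration only.
import torch
import torch.nn as nn
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans

texts = ["patient reports chest pain", "mri scan of the knee",
         "follow-up visit for diabetes", "x-ray shows knee fracture"]

# 1. Pre-trained model: contextual sentence embeddings (keeps word order/context,
#    unlike bag-of-words).
encoder = SentenceTransformer("all-MiniLM-L6-v2")   # assumed model choice
X = torch.tensor(encoder.encode(texts), dtype=torch.float32)

# 2. Autoencoder: compress the embeddings into a low-dimensional latent space.
class AutoEncoder(nn.Module):
    def __init__(self, dim_in, dim_latent=16):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(dim_in, 128), nn.ReLU(),
                                 nn.Linear(128, dim_latent))
        self.dec = nn.Sequential(nn.Linear(dim_latent, 128), nn.ReLU(),
                                 nn.Linear(128, dim_in))
    def forward(self, x):
        z = self.enc(x)
        return self.dec(z), z

ae = AutoEncoder(X.shape[1])
opt = torch.optim.Adam(ae.parameters(), lr=1e-3)
for _ in range(200):                     # reconstruction pre-training
    recon, _ = ae(X)
    loss = nn.functional.mse_loss(recon, X)
    opt.zero_grad(); loss.backward(); opt.step()

# 3. Clustering: K-means on the latent codes stands in for the paper's
#    deep embedding clustering (DEC-style) refinement.
with torch.no_grad():
    _, Z = ae(X)
labels = KMeans(n_clusters=2, n_init=10).fit_predict(Z.numpy())
print(labels)
```

In a DEC-style model, the K-means step would be replaced by a clustering layer whose soft assignments are refined jointly with the encoder; the sketch keeps K-means only to keep the example short and self-contained.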
Details
- Language :
- English
- ISSN :
- 1662-5188
- Database :
- Complementary Index
- Journal :
- Frontiers in Computational Neuroscience
- Publication Type :
- Academic Journal
- Accession number :
- 174902033
- Full Text :
- https://doi.org/10.3389/fncom.2023.1334436