Recognition model for major depressive disorder in Arabic user-generated content.
- Author
Rabie, Esraa M., Hashem, Atef F., and Alsheref, Fahad Kamal
- Subjects
NATURAL language processing, LONG short-term memory, ARTIFICIAL intelligence, TRANSFORMER models, DEEP learning, USER-generated content
- Abstract
Background: Depression has become one of the most prevalent psychological problems in the modern world, as mental health disorders grow increasingly common. As reported by the WHO, depression is the second-largest contributor to the worldwide burden of disease. As these issues grow, social media has become a major platform for people to express themselves, and a user's social media behavior may therefore reveal a great deal about their emotional state and mental health. In light of the disease's high prevalence, this research offers a novel framework for depression detection from Arabic textual data using deep learning (DL), natural language processing (NLP), machine learning (ML), and BERT transformer techniques. To this end, a dataset of tweets collected from three sources (described later in the paper) was used. The dataset was constructed in two variants, one for binary classification and the other for multi-classification. Results: For binary classification, we used ML techniques, namely support vector machine (SVM), random forest (RF), logistic regression (LR), and Gaussian naive Bayes (GNB), alongside the BERT transformer AraBERT. Comparing ML with BERT transformers, AraBERT achieved the highest binary-classification accuracy, at 93.03%. For multi-classification, we used the DL technique long short-term memory (LSTM) alongside the BERT transformer Multilingual BERT. Comparing DL with BERT transformers, Multilingual BERT achieved the highest multi-classification accuracy, at 97.8%. Conclusion: Through user-generated content, artificial intelligence technology can detect depressed people quickly and with high accuracy, instead of relying on medical assessment alone. [ABSTRACT FROM AUTHOR]
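The record only summarizes the paper's pipeline. As a rough, dependency-free illustration of the kind of classical-ML text baseline the abstract names, here is a minimal multinomial naive Bayes over a bag-of-words representation (a close cousin of the GNB/SVM/LR/RF baselines above, not the authors' actual code; all example tweets and labels below are invented placeholders, not the paper's dataset):

```python
import math
from collections import Counter, defaultdict

# Toy labeled tweets (invented placeholders, not the paper's data).
train = [
    ("I feel hopeless and empty", "depressed"),
    ("nothing matters anymore", "depressed"),
    ("great day with friends", "not_depressed"),
    ("excited about the new project", "not_depressed"),
]

def tokenize(text):
    return text.lower().split()

# Collect all training tokens per class.
class_docs = defaultdict(list)
for text, label in train:
    class_docs[label].extend(tokenize(text))

vocab = {tok for toks in class_docs.values() for tok in toks}

# Log-priors from class frequencies.
priors = {c: math.log(sum(1 for _, l in train if l == c) / len(train))
          for c in class_docs}

# Laplace-smoothed per-class token log-likelihoods.
likelihoods = {}
for c, toks in class_docs.items():
    counts = Counter(toks)
    total = len(toks) + len(vocab)
    likelihoods[c] = {t: math.log((counts[t] + 1) / total) for t in vocab}
    likelihoods[c]["<unk>"] = math.log(1 / total)  # unseen-token fallback

def predict(text):
    # Score each class as log-prior plus summed token log-likelihoods.
    scores = {}
    for c in class_docs:
        s = priors[c]
        for tok in tokenize(text):
            s += likelihoods[c].get(tok, likelihoods[c]["<unk>"])
        scores[c] = s
    return max(scores, key=scores.get)

print(predict("I feel so empty"))   # -> depressed
print(predict("had a great day"))   # -> not_depressed
```

A transformer such as AraBERT replaces the bag-of-words counts with contextual embeddings, which is what the abstract credits for the accuracy gap.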
- Published
- 2025