Federated Learning for Sarcasm Detection: A Study of Attention-Enhanced BILSTM, GRU, and LSTM Architectures
- Source :
- IEEE Access, Vol. 12, pp. 196786-196802 (2024)
- Publication Year :
- 2024
- Publisher :
- IEEE, 2024.
Abstract
- Traditional centralized machine learning methods require aggregating large amounts of personal data on central servers, raising privacy concerns. Federated Learning (FL) offers a solution by training models directly on users’ devices, thus preserving data privacy. This paper proposes the use of FL for sarcasm detection, applying neural network architectures such as Bidirectional Long Short-Term Memory (BILSTM), Gated Recurrent Unit (GRU), and Long Short-Term Memory (LSTM). The experiments were performed across multiple clients. The results show that BILSTM outperforms GRU and LSTM in terms of accuracy, precision, recall, and F1 score, making it the most effective model for sarcasm detection in an FL setting. Beyond evaluating these models, this study also examined how their performance may be further improved by integrating attention mechanisms. The results show that attention-based models can significantly improve performance by focusing on the parts of the input that are most contextually relevant. Building on this work, future research could focus on improving the performance of these models and exploring hybrid approaches that combine the strengths of LSTM, GRU, and BILSTM.
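- The paper does not publish its aggregation code, but the FL setup it describes, where each client trains locally and a server combines the resulting model parameters, is commonly realized with FedAvg-style weighted averaging. The sketch below illustrates that aggregation step on plain parameter lists; the function name `fedavg` and the toy client values are illustrative assumptions, not the authors' implementation.

```python
def fedavg(client_weights, client_sizes):
    """Combine per-client parameter vectors into a global model.

    FedAvg-style aggregation: each client's parameters are weighted
    by its share of the total training data, so clients with more
    local samples contribute proportionally more.
    """
    total = sum(client_sizes)
    dim = len(client_weights[0])
    avg = [0.0] * dim
    for w, n in zip(client_weights, client_sizes):
        for i in range(dim):
            avg[i] += w[i] * (n / total)
    return avg

# Toy example: two clients sharing a two-parameter "model".
w_a = [1.0, 3.0]   # hypothetical client A parameters after local training
w_b = [3.0, 5.0]   # hypothetical client B parameters
global_w = fedavg([w_a, w_b], [100, 100])  # equal local dataset sizes
```

- With equal dataset sizes this reduces to a simple mean; in a real deployment the same averaging would be applied layer-by-layer to the BILSTM, GRU, or LSTM weight tensors after each communication round.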
Details
- Language :
- English
- ISSN :
- 2169-3536
- Volume :
- 12
- Database :
- Directory of Open Access Journals
- Journal :
- IEEE Access
- Publication Type :
- Academic Journal
- Accession number :
- edsdoj.8d027e568a88492c96de4bf8ea67edf3
- Document Type :
- article
- Full Text :
- https://doi.org/10.1109/ACCESS.2024.3520659