Multi-Task Attentive Residual Networks for Argument Mining
- Source: IEEE/ACM Transactions on Audio, Speech, and Language Processing, vol. 31, pp. 1877-1892, 2023
- Publication Year: 2021
Abstract
- We explore the use of residual networks and neural attention for multiple argument mining tasks. We propose a residual architecture that exploits attention, multi-task learning, and ensembling, without any assumption on document or argument structure. We present an extensive experimental evaluation on five different corpora of user-generated comments, scientific publications, and persuasive essays. Our results show that our approach is a strong competitor against state-of-the-art architectures with a higher computational footprint or corpus-specific design, representing an interesting compromise between generality, accuracy, and model size.
- Comment: 16 pages, 3 figures
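The abstract describes an architecture combining residual connections, attention, and multi-task learning with shared parameters across tasks. As a rough illustration of that general idea (not the authors' exact architecture; all module names, hyperparameters, and the two task heads below are illustrative assumptions), here is a minimal PyTorch sketch of a shared attentive residual encoder with per-task classification heads:

```python
# Minimal sketch, assuming a shared attentive residual encoder with two
# illustrative task heads (argument component labeling and link prediction).
# This is NOT the architecture from the paper; it only illustrates how
# attention, residual connections, and multi-task heads can be combined.
import torch
import torch.nn as nn


class AttentiveResidualBlock(nn.Module):
    def __init__(self, dim: int, num_heads: int = 4, dropout: float = 0.1):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, dropout=dropout,
                                          batch_first=True)
        self.ff = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(),
                                nn.Dropout(dropout), nn.Linear(dim, dim))
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual connection around self-attention, then around a feed-forward layer.
        attn_out, _ = self.attn(x, x, x)
        x = self.norm1(x + attn_out)
        return self.norm2(x + self.ff(x))


class MultiTaskArgumentMiner(nn.Module):
    """Shared attentive residual encoder with per-task classification heads."""

    def __init__(self, input_dim: int, hidden_dim: int = 256,
                 num_blocks: int = 2, num_component_labels: int = 5,
                 num_link_labels: int = 2):
        super().__init__()
        self.project = nn.Linear(input_dim, hidden_dim)
        self.blocks = nn.ModuleList(
            AttentiveResidualBlock(hidden_dim) for _ in range(num_blocks))
        # Task-specific heads share the encoder (multi-task learning).
        self.component_head = nn.Linear(hidden_dim, num_component_labels)
        self.link_head = nn.Linear(hidden_dim, num_link_labels)

    def forward(self, token_embeddings: torch.Tensor):
        h = self.project(token_embeddings)
        for block in self.blocks:
            h = block(h)
        return self.component_head(h), self.link_head(h)


if __name__ == "__main__":
    model = MultiTaskArgumentMiner(input_dim=768)
    tokens = torch.randn(2, 50, 768)  # (batch, sequence, embedding)
    component_logits, link_logits = model(tokens)
    print(component_logits.shape, link_logits.shape)
```

In a multi-task setup like this, the per-task losses would typically be summed (possibly weighted) so that gradients from all tasks update the shared encoder.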
Details
- Database: arXiv
- Journal: IEEE/ACM Transactions on Audio, Speech, and Language Processing, vol. 31, pp. 1877-1892, 2023
- Publication Type: Report
- Accession Number: edsarx.2102.12227
- Document Type: Working Paper
- Full Text: https://doi.org/10.1109/TASLP.2023.3275040