Discontinuous grammar as a foreign language.
- Source :
- Neurocomputing, Mar 2023, Vol. 524, p43-58. 16p.
- Publication Year :
- 2023
Abstract
- • Sequence-to-sequence constituent parsers lag behind task-specific techniques. • We notably improve their accuracy by introducing a powerful neural architecture. • For the first time, we extend their coverage to handle discontinuous constituent parsing. • Our text-to-parse transducer achieves competitive performance on the main benchmarks.
- Syntactic constituent parsing is a vital step toward deep natural language understanding, and it is highly demanded by many artificial intelligence systems that process both text and speech. One of the most recent proposals is to perform constituent parsing as a machine translation task with standard sequence-to-sequence models, instead of applying task-specific parsers. While they show competitive performance, these text-to-parse transducers still lag behind classic techniques in accuracy, coverage and speed. To close the gap, we extend the sequence-to-sequence framework for constituent parsing, not only by providing a more powerful neural architecture that improves performance, but also by enlarging its coverage to handle the most complex syntactic phenomenon: discontinuous structures. To that end, we design several novel linearizations that can fully produce discontinuities and, for the first time, we test a sequence-to-sequence model on the main discontinuous benchmarks, obtaining results on par with task-specific discontinuous constituent parsers and achieving state-of-the-art scores on the (discontinuous) English Penn Treebank. [ABSTRACT FROM AUTHOR]
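The core idea the abstract describes is to serialize a constituent tree into a token sequence so that a sequence-to-sequence model can emit it like translated text. The following is a minimal illustrative sketch of one classic bracket-style linearization, not the authors' own linearizations (which additionally encode discontinuities); the example tree and function names are hypothetical.

```python
def linearize(tree):
    """Flatten a nested (label, children) tree into bracket tokens,
    e.g. "(S (NP the dog ) (VP barks ) )" -- a sketch of the kind of
    linearization used in parsing-as-translation.

    A node is a pair (label, children); a leaf has an empty child list
    and its label is the terminal word.
    """
    label, children = tree
    if not children:               # terminal word
        return [label]
    tokens = ["(" + label]         # open the constituent
    for child in children:
        tokens.extend(linearize(child))
    tokens.append(")")             # close the constituent
    return tokens

# Hypothetical tree for "the dog barks"
tree = ("S",
        [("NP", [("the", []), ("dog", [])]),
         ("VP", [("barks", [])])])

print(" ".join(linearize(tree)))
# → (S (NP the dog ) (VP barks ) )
```

A seq2seq parser is then trained to map the word sequence to such a token sequence; handling discontinuous constituents requires richer target vocabularies (e.g. index or reordering tokens), which is what the paper's novel linearizations address.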
Details
- Language :
- English
- ISSN :
- 0925-2312
- Volume :
- 524
- Database :
- Academic Search Index
- Journal :
- Neurocomputing
- Publication Type :
- Academic Journal
- Accession number :
- 161401848
- Full Text :
- https://doi.org/10.1016/j.neucom.2022.12.045