Shift-Reduce Task-Oriented Semantic Parsing with Stack-Transformers.
- Source :
- Cognitive Computation; Nov2024, Vol. 16 Issue 6, p2846-2862, 17p
- Publication Year :
- 2024
-
Abstract
- Intelligent voice assistants, such as Apple Siri and Amazon Alexa, are widely used nowadays. These task-oriented dialogue systems require a semantic parsing module in order to process user utterances and understand the action to be performed. This semantic parsing component was initially implemented by rule-based or statistical slot-filling approaches for processing simple queries; however, the appearance of more complex utterances demanded the application of shift-reduce parsers or sequence-to-sequence models. Although shift-reduce approaches were initially considered the most promising option, the emergence of sequence-to-sequence neural systems has propelled them to the forefront as the highest-performing method for this particular task. In this article, we advance the research on shift-reduce semantic parsing for task-oriented dialogue. We implement novel shift-reduce parsers that rely on Stack-Transformers. This framework allows transition systems to be adequately modeled on the transformer neural architecture, notably boosting shift-reduce parsing performance. Furthermore, our approach goes beyond the conventional top-down algorithm: we incorporate alternative bottom-up and in-order transition systems derived from constituency parsing into the realm of task-oriented parsing. We extensively test our approach on multiple domains from the Facebook TOP benchmark, improving over existing shift-reduce parsers and state-of-the-art sequence-to-sequence models in both high-resource and low-resource settings. We also empirically show that the in-order algorithm substantially outperforms the commonly used top-down strategy. Through the creation of innovative transition systems and the capabilities of a robust neural architecture, our study showcases the superiority of shift-reduce parsers over leading sequence-to-sequence methods on the main benchmark. [ABSTRACT FROM AUTHOR]
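To make the abstract's notion of a transition system concrete, below is a minimal, illustrative sketch of a top-down shift-reduce parser for TOP-style trees. This is not the authors' implementation: the action names (SHIFT, OPEN:<label>, REDUCE), the bracketed output format, and the example intent/slot labels are assumptions chosen for illustration of the general technique.

```python
def parse(tokens, actions):
    """Apply a gold action sequence to build a bracketed TOP-style tree.

    Hypothetical top-down transition system:
      SHIFT        - move the next word from the buffer onto the stack
      OPEN:<label> - open a new intent/slot constituent on the stack
      REDUCE       - close the most recently opened constituent
    """
    stack, buffer = [], list(tokens)
    for act in actions:
        if act == "SHIFT":
            stack.append(buffer.pop(0))
        elif act.startswith("OPEN:"):
            # Tuples mark still-open constituents; closed ones are strings.
            stack.append(("open", act[5:]))
        elif act == "REDUCE":
            children = []
            while not isinstance(stack[-1], tuple):
                children.append(stack.pop())
            label = stack.pop()[1]
            stack.append("[" + label + " " + " ".join(reversed(children)) + " ]")
    assert len(stack) == 1 and not buffer, "actions did not yield one full tree"
    return stack[0]


# Example: "weather in Boston" parsed as a GET_WEATHER intent with a
# LOCATION slot (labels are illustrative, in the spirit of the TOP benchmark).
tree = parse(
    "weather in Boston".split(),
    ["OPEN:IN:GET_WEATHER", "SHIFT", "SHIFT",
     "OPEN:SL:LOCATION", "SHIFT", "REDUCE", "REDUCE"],
)
print(tree)  # [IN:GET_WEATHER weather in [SL:LOCATION Boston ] ]
```

In a trained parser, the action at each step would be predicted from the stack and buffer state (here encoded by a Stack-Transformer rather than replayed from a gold sequence); the in-order and bottom-up variants discussed in the article differ in when the constituent label is emitted relative to its children.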
Details
- Language :
- English
- ISSN :
- 1866-9956
- Volume :
- 16
- Issue :
- 6
- Database :
- Complementary Index
- Journal :
- Cognitive Computation
- Publication Type :
- Academic Journal
- Accession number :
- 180735558
- Full Text :
- https://doi.org/10.1007/s12559-024-10339-4