Highly-Inflected Language Generation Using Factored Language Models
- Source :
- Computational Linguistics and Intelligent Text Processing ISBN: 9783642193996, CICLing (1)
- Publication Year :
- 2011
- Publisher :
- Springer Berlin Heidelberg, 2011.
Abstract
- Statistical language models based on n-gram counts have been shown to successfully replace grammar rules in standard 2-stage (or 'generate-and-select') Natural Language Generation (NLG). In highly-inflected languages, however, the amount of training data required to cope with n-gram sparseness may be simply unobtainable, and the benefits of a statistical approach become less obvious. In this work we address the issue of text generation in a highly-inflected language by making use of factored language models (FLM) that take morphological information into account. We present a number of experiments involving the use of simple FLMs applied to various surface realisation tasks, showing that FLMs may implement 2-stage generation with results far superior to those of standard n-gram models alone.
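- The record does not include an implementation, but as a rough illustration the Python sketch below shows how a factored bigram model with a backoff chain from the full word form to (lemma, morphology) and finally to POS could score candidate realisations in the 'select' stage of 2-stage generation. All names, the factor set, and the backoff chain are illustrative assumptions, not the paper's actual model.

```python
from collections import defaultdict
import math

# A token is a tuple of factors: (word form, lemma, POS tag, morphological tag).

class FactoredBigramLM:
    """Bigram LM over factored tokens with a simple backoff chain:
    full word form -> (lemma, morphology) -> POS tag."""

    def __init__(self):
        self.bi = [defaultdict(int) for _ in range(3)]   # bigram counts per backoff level
        self.uni = [defaultdict(int) for _ in range(3)]  # context counts per backoff level

    @staticmethod
    def _views(tok):
        word, lemma, pos, morph = tok
        # One view of the token per backoff level, from most to least specific.
        return (word, (lemma, morph), pos)

    def train(self, sentences):
        for sent in sentences:
            prev = ("<s>", "<s>", "<s>", "<s>")
            for tok in sent:
                for lvl, (p, c) in enumerate(zip(self._views(prev), self._views(tok))):
                    self.bi[lvl][(p, c)] += 1
                    self.uni[lvl][p] += 1
                prev = tok

    def logprob(self, sent):
        lp = 0.0
        prev = ("<s>", "<s>", "<s>", "<s>")
        for tok in sent:
            # Use the most specific level with an observed bigram; back off otherwise.
            for lvl, (p, c) in enumerate(zip(self._views(prev), self._views(tok))):
                num, den = self.bi[lvl].get((p, c), 0), self.uni[lvl].get(p, 0)
                if num and den:
                    lp += math.log(num / den)
                    break
            else:
                lp += math.log(1e-6)  # floor for events unseen at every level
            prev = tok
        return lp


def select_best(candidates, lm):
    """Stage 2 ('select'): return the candidate realisation the FLM scores highest."""
    return max(candidates, key=lm.logprob)
```

- In practice the paper's FLMs may use different factor sets and smoothing; the backoff chain here is only meant to show how coarser morphological factors can mitigate the n-gram sparseness that the abstract describes.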
- Subjects :
- Language identification, Computer science, Modeling language, Object language, Natural language generation, Natural language programming, Specification language, Language primitive, Universal Networking Language, Cache language model, Factored language model, Language model, Artificial intelligence, Natural language processing
Details
- ISBN :
- 978-3-642-19399-6
- ISBNs :
- 9783642193996
- Database :
- OpenAIRE
- Journal :
- Computational Linguistics and Intelligent Text Processing ISBN: 9783642193996, CICLing (1)
- Accession number :
- edsair.doi...........c1abd95119083309bb78b7a12ab23036
- Full Text :
- https://doi.org/10.1007/978-3-642-19400-9_34