Neural Machine Translation with Phrasal Attention
- Source :
- Communications in Computer and Information Science ISBN: 9789811071331, CWMT
- Publication Year :
- 2017
- Publisher :
- Springer Singapore, 2017.
Abstract
- Attention-based neural machine translation (NMT) employs an attention network to capture structural correspondences between the source and target languages at the word level. Unfortunately, alignments between source and target equivalents can be complicated, which makes word-level attention inadequate to model these relations (e.g., the alignment between a source idiom and its target translation). To address this issue, we propose a phrase-level attention mechanism that complements the word-level attention network. The proposed phrasal attention framework is simple yet effective, retaining the strength of phrase-based statistical machine translation (SMT) on the source side. Experiments on a Chinese-to-English translation task demonstrate that the proposed method yields statistically significant improvements over word-level attention-based NMT.
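The idea of complementing word-level attention with attention over source phrases can be illustrated with a minimal sketch. This is not the authors' exact formulation: the mean-pooled n-gram phrase representations, the dot-product scoring, and the additive combination of the two contexts are all simplifying assumptions made here for illustration.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_context(query, keys):
    # dot-product attention: score each key against the query,
    # normalize to weights, and return the weighted-sum context
    scores = keys @ query
    weights = softmax(scores)
    return weights @ keys, weights

def phrasal_attention(query, src_states, phrase_len=2):
    # word-level attention over individual source hidden states
    word_ctx, _ = attention_context(query, src_states)
    # phrase-level attention over mean-pooled n-gram spans
    # (hypothetical pooling choice; the paper's phrase encoder may differ)
    phrases = np.array([src_states[i:i + phrase_len].mean(axis=0)
                        for i in range(len(src_states) - phrase_len + 1)])
    phrase_ctx, _ = attention_context(query, phrases)
    # combine the two contexts; a learned gate or interpolation
    # would be used in a trained model, a plain sum suffices here
    return word_ctx + phrase_ctx

rng = np.random.default_rng(0)
src = rng.normal(size=(5, 8))   # 5 source words, hidden size 8
q = rng.normal(size=8)          # decoder query state
ctx = phrasal_attention(q, src)
print(ctx.shape)                # combined context, shape (8,)
```

The combined context then feeds the decoder exactly where a purely word-level context would, so phrase information is added without changing the rest of the NMT architecture.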
- Subjects :
- Phrase
Machine translation
Computer science
Speech recognition
Translation
Example-based machine translation
Recurrent neural network
Artificial intelligence
Natural language processing
Details
- ISBN :
- 978-981-10-7133-1
- ISBNs :
- 9789811071331
- Database :
- OpenAIRE
- Journal :
- Communications in Computer and Information Science ISBN: 9789811071331, CWMT
- Accession number :
- edsair.doi...........1e9d50465006364edf95bfa92dddea0b