
Fallback Variable History NNLMs: Efficient NNLMs by precomputation and stochastic training

Authors :
Zamora Martínez, Francisco Julián
España Boquera, Salvador
Castro-Bleda, Maria Jose
Palacios Corella
Affiliations :
Universitat Politècnica de València. Departamento de Sistemas Informáticos y Computación - Departament de Sistemes Informàtics i Computació
Agencia Estatal de Investigación
Publication Year :
2018

Abstract

[EN] This paper presents a new method to reduce the computational cost of using Neural Networks as Language Models during recognition, in some particular scenarios. It is based on a Neural Network that considers input contexts of different lengths, which eases the use of a fallback mechanism together with the precomputation of softmax normalization constants for these inputs. The proposed approach is empirically validated, showing its capability to emulate lower-order N-grams with a single Neural Network. A machine translation task shows that, in some practical cases, the proposed model is a good solution to the normalization cost of the output softmax layer of Neural Networks, improving system speed without a significant impact on performance.
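The core idea of the abstract can be illustrated with a toy sketch (not the authors' actual model; all names, sizes, and the context set here are hypothetical): log-normalization constants of the output softmax are precomputed offline for a fixed set of histories, and at decode time an unseen history falls back to a shorter one whose constant is already cached, so no sum over the vocabulary is ever needed during recognition.

```python
import math
import random

VOCAB, DIM = 50, 8  # toy vocabulary and hidden sizes (illustrative)

random.seed(0)
# Stand-in output layer of a toy NNLM: logit[w] = dot(h, W[w]) + b[w].
W = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(VOCAB)]
b = [random.gauss(0, 1) for _ in range(VOCAB)]

def hidden_state(ctx):
    # Stand-in for the network's hidden layer: a deterministic
    # pseudo-embedding of the context tuple (illustrative only).
    r = random.Random(str(ctx))
    return [r.gauss(0, 1) for _ in range(DIM)]

def logits(ctx):
    h = hidden_state(ctx)
    return [sum(hi * wi for hi, wi in zip(h, row)) + bw
            for row, bw in zip(W, b)]

# Precompute log-normalization constants offline for the contexts the
# fallback mechanism may use (here: a bigram, unigram, and empty history).
known = [("the", "cat"), ("the",), ()]
log_Z = {ctx: math.log(sum(math.exp(v) for v in logits(ctx)))
         for ctx in known}

def log_prob(word_id, ctx):
    # Fall back to progressively shorter histories until a precomputed
    # constant exists, emulating lower-order N-grams with one network.
    while ctx not in log_Z:
        ctx = ctx[1:]
    return logits(ctx)[word_id] - log_Z[ctx]

# An unseen trigram history falls back to the cached bigram ("the", "cat");
# the resulting distribution is properly normalized with no decode-time sum.
probs = [math.exp(log_prob(w, ("unseen", "the", "cat")))
         for w in range(VOCAB)]
```

The design choice being sketched is the trade-off the paper describes: the expensive vocabulary-wide normalization is moved entirely offline, at the cost of restricting decoding to histories (or their fallback truncations) whose constants were precomputed.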

Details

Database :
OAIster
Notes :
TEXT, English
Publication Type :
Electronic Resource
Accession number :
edsoai.on1138453441
Document Type :
Electronic Resource