
Natural Language Processing with Small Feed-Forward Networks

Authors:
Botha, Jan A.
Pitler, Emily
Ma, Ji
Bakalov, Anton
Salcianu, Alex
Weiss, David
McDonald, Ryan
Petrov, Slav
Publication Year:
2017

Abstract

We show that small and shallow feed-forward neural networks can achieve near state-of-the-art results on a range of unstructured and structured language processing tasks while being considerably cheaper in memory and computational requirements than deep recurrent models. Motivated by resource-constrained environments like mobile phones, we showcase simple techniques for obtaining such small neural network models, and investigate different tradeoffs when deciding how to allocate a small memory budget.

Comment: EMNLP 2017 short paper
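For illustration, the sketch below shows the general shape of a small, shallow feed-forward model of the kind the abstract contrasts with deep recurrent models: feature embeddings for a small input window are concatenated and passed through a single hidden layer to produce label scores. It assumes PyTorch, and all layer sizes, the window width, and the label count are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (assuming PyTorch) of a small, shallow feed-forward tagger.
# All dimensions below are illustrative, not taken from the paper.
import torch
import torch.nn as nn

class SmallFeedForwardTagger(nn.Module):
    def __init__(self, vocab_size=5000, embed_dim=16,
                 window=3, hidden_dim=64, num_labels=12):
        super().__init__()
        # Small embedding table; the memory budget is typically dominated
        # by vocab_size * embed_dim.
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # A single shallow hidden layer over the concatenated window embeddings.
        self.hidden = nn.Linear(window * embed_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, num_labels)

    def forward(self, feature_ids):
        # feature_ids: (batch, window) integer feature indices.
        emb = self.embed(feature_ids)        # (batch, window, embed_dim)
        flat = emb.flatten(start_dim=1)      # (batch, window * embed_dim)
        return self.out(torch.relu(self.hidden(flat)))

model = SmallFeedForwardTagger()
scores = model(torch.randint(0, 5000, (8, 3)))   # (8, 12) label scores
print(sum(p.numel() for p in model.parameters()), "parameters")
```

Because the parameter count is dominated by the embedding table rather than the shallow hidden layer, trading off embedding vocabulary and dimensionality against hidden-layer size is the kind of memory-budget allocation question the abstract refers to.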

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.1708.00214
Document Type:
Working Paper