
An Input Residual Connection for Simplifying Gated Recurrent Neural Networks

Authors :
Mehrtash Harandi
Hanna Suominen
Nicholas I-Hsien Kuo
Christian Walder
Gabriela Ferraro
Nicolas Fourrier
Source :
IJCNN
Publication Year :
2020
Publisher :
IEEE, 2020.

Abstract

Gated Recurrent Neural Networks (GRNNs) are important models that continue to push the state of the art across different machine learning problems. However, they are composed of intricate components that are generally not well understood. We increase GRNN interpretability by linking the canonical Gated Recurrent Unit (GRU) design to the well-studied Hopfield network. This connection allowed us to identify network redundancies, which we simplified with an Input Residual Connection (IRC). We tested GRNNs against their IRC counterparts on language modelling. In addition, we proposed an Input Highway Connection (IHC) as an advanced application of the IRC, and evaluated the most widely applied GRNN, the Long Short-Term Memory (LSTM), against the IHC-LSTM on the tasks of i) image generation and ii) learning to learn to update another learner network. Despite their parameter reductions, all IRC-GRNNs generalised comparably to or better than their baseline models. Furthermore, compared to the LSTM, the IHC-LSTM removed 85.4% of the parameters on image generation. In conclusion, the IRC is applicable not only to the GRNN designs of GRUs and LSTMs but also to FastGRNNs, Simple Recurrent Units (SRUs), and Strongly-Typed Recurrent Neural Networks (T-RNNs).
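
The abstract does not give the IRC equations, but the stated idea of simplifying gate-based recurrences suggests a cell in which the learned candidate-state sub-network is replaced by the input itself. The following is a minimal, hypothetical PyTorch sketch under that assumption; the class name, the omission of the reset gate, and the optional input projection are illustrative choices, not the authors' reference implementation.

```python
# Hypothetical IRC-style GRU cell: the Input Residual Connection is assumed
# to let the update gate blend the previous hidden state directly with a
# (projected) copy of the input, removing the tanh candidate layer and its
# parameters. The paper's exact formulation may differ.
import torch
import torch.nn as nn


class IRCGRUCell(nn.Module):
    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        # The update gate is kept as in the canonical GRU.
        self.update_gate = nn.Linear(input_size + hidden_size, hidden_size)
        # The candidate sub-network is replaced by, at most, a linear
        # projection of the input (identity when the sizes already match).
        self.proj = (nn.Identity() if input_size == hidden_size
                     else nn.Linear(input_size, hidden_size, bias=False))

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        z = torch.sigmoid(self.update_gate(torch.cat([x, h], dim=-1)))
        # Input Residual Connection: the input plays the role of the
        # candidate state in the convex combination with the old state.
        return (1.0 - z) * h + z * self.proj(x)


# Usage sketch: one recurrent step over a batch of 8 examples.
cell = IRCGRUCell(input_size=32, hidden_size=64)
x = torch.randn(8, 32)
h = torch.zeros(8, 64)
h_next = cell(x, h)  # shape: (8, 64)
```

Because the candidate layer (and, in this sketch, the reset gate) is dropped, the cell carries noticeably fewer parameters than a standard GRU of the same hidden size, which is consistent with the parameter reductions reported in the abstract.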

Details

Database :
OpenAIRE
Journal :
2020 International Joint Conference on Neural Networks (IJCNN)
Accession number :
edsair.doi...........3e21b2da947ac01071459aa17b743878
Full Text :
https://doi.org/10.1109/ijcnn48605.2020.9207238