Return of the RNN: Residual Recurrent Networks for Invertible Sentence Embeddings
- Author
- Wilkerson, Jeremy
- Subjects
- Computer Science - Computation and Language; Computer Science - Machine Learning
- Abstract
- This study presents a novel model for invertible sentence embeddings using a residual recurrent network trained on an unsupervised encoding task. Rather than the probabilistic outputs common to neural machine translation models, our approach employs a regression-based output layer to reconstruct the input sequence's word vectors. The model achieves high accuracy and fast training with the Adam optimizer, a significant finding given that RNNs typically require memory units, such as LSTMs, or second-order optimization methods. We incorporate residual connections and introduce a "match drop" technique, in which gradients are calculated only for incorrectly reconstructed words. Our approach demonstrates potential for various natural language processing applications, particularly in neural network-based systems that require high-quality sentence embeddings.
- Comment
- Adds descriptions of the use of dropout, the use of custom C++ code, the removal of non-English sentences, and other minor changes.
- Published
- 2023
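The abstract's "match drop" idea, where the regression loss propagates gradients only for incorrectly reconstructed words, can be sketched as a simple masking step. This is an illustrative interpretation, not the paper's implementation: the function name, shapes, and nearest-neighbor decoding step below are all assumptions.

```python
import numpy as np

def match_drop_loss_grad(pred, target, vocab):
    """Illustrative 'match drop' sketch (hypothetical, not the paper's code).

    pred, target: (seq_len, dim) arrays of predicted / true word vectors.
    vocab: (V, dim) array of word vectors used to decode predictions.
    Returns the masked squared-error loss and its gradient w.r.t. pred,
    where correctly decoded positions contribute zero gradient.
    """
    # Decode each vector to its best-matching vocabulary word (dot-product
    # nearest neighbor, an assumed decoding rule for this sketch).
    pred_ids = np.argmax(pred @ vocab.T, axis=1)
    target_ids = np.argmax(target @ vocab.T, axis=1)

    # Mask is 1 where the decoded word is wrong, 0 where it is already correct.
    wrong = (pred_ids != target_ids).astype(pred.dtype)[:, None]

    diff = pred - target
    loss = 0.5 * np.sum((wrong * diff) ** 2)
    grad = wrong * diff  # gradient flows only through incorrect words
    return loss, grad
```

With a one-hot vocabulary, a position whose predicted vector already decodes to the target word gets a zero gradient row, so training effort concentrates on the words the model still reconstructs incorrectly.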