
Why and when should you pool? Analyzing Pooling in Recurrent Architectures

Authors :
Maini, Pratyush
Kolluru, Keshav
Pruthi, Danish
Mausam
Publication Year :
2020

Abstract

Pooling-based recurrent neural architectures consistently outperform their counterparts without pooling. However, the reasons for their enhanced performance are largely unexamined. In this work, we examine three commonly used pooling techniques (mean-pooling, max-pooling, and attention), and propose max-attention, a novel variant that effectively captures interactions among predictive tokens in a sentence. We find that pooling-based architectures substantially differ from their non-pooling equivalents in their learning ability and positional biases, which elucidates their performance benefits. By analyzing gradient propagation, we discover that pooling facilitates better gradient flow than BiLSTMs without pooling. Further, we show that BiLSTMs are positionally biased towards tokens at the beginning and end of a sequence, and that pooling alleviates such biases. Consequently, we identify settings where pooling offers large benefits: (i) in low-resource scenarios, and (ii) when important words lie towards the middle of the sentence. Among the pooling techniques studied, max-attention is the most effective, resulting in significant performance gains on several text classification tasks.

Comment: Accepted to Findings of EMNLP 2020, to be presented at BlackBoxNLP. Updated version.
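To make the pooling operations concrete, below is a minimal PyTorch sketch (not the authors' released code) of the three standard pooling strategies over BiLSTM hidden states, plus one plausible reading of max-attention in which the max-pooled vector serves as the attention query. The class name PooledBiLSTM, the hyperparameters, and the max-attention formulation are illustrative assumptions, not the paper's exact implementation.

import torch
import torch.nn as nn

class PooledBiLSTM(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=100, hidden_dim=128, pooling="max"):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hidden_dim, bidirectional=True, batch_first=True)
        self.score = nn.Linear(2 * hidden_dim, 1)    # scorer for attention pooling
        self.pooling = pooling

    def forward(self, tokens):                       # tokens: (batch, seq_len)
        h, _ = self.bilstm(self.embed(tokens))       # h: (batch, seq_len, 2*hidden_dim)
        if self.pooling == "mean":
            return h.mean(dim=1)                     # average hidden state over time
        if self.pooling == "max":
            return h.max(dim=1).values               # element-wise max over time
        if self.pooling == "attention":
            w = torch.softmax(self.score(h), dim=1)  # (batch, seq_len, 1) weights
            return (w * h).sum(dim=1)                # attention-weighted sum
        if self.pooling == "max_attention":
            # Assumed formulation: the max-pooled vector acts as a query for
            # dot-product attention over all time steps (illustrative only).
            q = h.max(dim=1).values.unsqueeze(1)     # (batch, 1, 2*hidden_dim)
            w = torch.softmax((q * h).sum(dim=-1, keepdim=True), dim=1)
            return (w * h).sum(dim=1)
        raise ValueError(f"unknown pooling: {self.pooling}")

# Usage: pool a batch of 4 toy token sequences of length 32 into fixed vectors.
model = PooledBiLSTM(pooling="max_attention")
sentence = torch.randint(0, 10000, (4, 32))
features = model(sentence)                           # shape: (4, 256)

Note that mean- and max-pooling are parameter-free, while the attention variants learn how to weight time steps; this difference in how gradients reach each token is central to the paper's gradient-flow analysis.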

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1228405114
Document Type :
Electronic Resource