
LCTR: On Awakening the Local Continuity of Transformer for Weakly Supervised Object Localization

Authors :
Chen, Zhiwei
Wang, Changan
Wang, Yabiao
Jiang, Guannan
Shen, Yunhang
Tai, Ying
Wang, Chengjie
Zhang, Wei
Cao, Liujuan
Publication Year :
2021

Abstract

Weakly supervised object localization (WSOL) aims to learn an object localizer solely from image-level labels. Convolutional neural network (CNN)-based techniques often highlight only the most discriminative parts of objects while ignoring the full object extent. Recently, the transformer architecture has been applied to WSOL to capture long-range feature dependencies through its self-attention mechanism and multilayer perceptron structure. Nevertheless, transformers lack the locality inductive bias inherent to CNNs and may therefore lose local feature details in WSOL. In this paper, we propose a novel transformer-based framework, termed LCTR (Local Continuity TRansformer), which aims to enhance the local perception capability of global features among long-range feature dependencies. To this end, we propose a relational patch-attention module (RPAM), which considers cross-patch information on a global basis. We further design a cue digging module (CDM), which leverages local features to guide the model's learning toward highlighting weak local responses. Finally, comprehensive experiments are carried out on two widely used datasets, i.e., CUB-200-2011 and ILSVRC, to verify the effectiveness of our method.
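To make the cross-patch idea concrete, below is a minimal, hypothetical sketch of how relational patch attention over ViT-style patch tokens could be used to produce a coarse localization map. The module name, dimensions, and the way attention weights are pooled are assumptions for illustration only, not the paper's actual RPAM or CDM implementation.

```python
# Hypothetical sketch: cross-patch relational attention producing a localization cue.
# Assumes ViT-style patch tokens of shape (batch, num_patches, embed_dim);
# all names and hyperparameters here are illustrative, not from the paper.
import torch
import torch.nn as nn


class RelationalPatchAttention(nn.Module):
    """Aggregates cross-patch relations into a per-patch localization score."""

    def __init__(self, embed_dim: int = 192, num_heads: int = 3):
        super().__init__()
        self.norm = nn.LayerNorm(embed_dim)
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (B, N, D) patch embeddings from a transformer backbone
        x = self.norm(tokens)
        _, attn_weights = self.attn(x, x, x, need_weights=True)  # (B, N, N)
        # Average the attention each patch receives from all other patches
        # as a crude global-relation score, then reshape to a square map.
        relation_score = attn_weights.mean(dim=1)                 # (B, N)
        side = int(relation_score.shape[1] ** 0.5)
        return relation_score.reshape(-1, side, side)             # (B, H, W)


if __name__ == "__main__":
    tokens = torch.randn(2, 196, 192)      # e.g. 14x14 patches of a 224x224 image
    loc_map = RelationalPatchAttention()(tokens)
    print(loc_map.shape)                   # torch.Size([2, 14, 14])
```

In this sketch the pooled attention map plays the role of a coarse object-extent cue; the paper's CDM would additionally use local features to strengthen weak responses in such a map, a step omitted here.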

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2112.05291
Document Type :
Working Paper