
Provable Stochastic Optimization for Global Contrastive Learning: Small Batch Does Not Harm Performance

Authors:
Yuan, Zhuoning
Wu, Yuexin
Qiu, Zi-Hao
Du, Xianzhi
Zhang, Lijun
Zhou, Denny
Yang, Tianbao
Publication Year:
2022

Abstract

In this paper, we study contrastive learning from an optimization perspective, aiming to analyze and address a fundamental issue of existing contrastive learning methods that rely on either a large batch size or a large dictionary of feature vectors. We consider a global objective for contrastive learning, which contrasts each positive pair with all negative pairs for an anchor point. From the optimization perspective, we explain why existing methods such as SimCLR require a large batch size in order to achieve a satisfactory result. To remove such a requirement, we propose a memory-efficient Stochastic Optimization algorithm for solving the Global objective of Contrastive Learning of Representations, named SogCLR. We show that its optimization error is negligible under a reasonable condition after a sufficient number of iterations, or is diminishing for a slightly different global contrastive objective. Empirically, we demonstrate that SogCLR with a small batch size (e.g., 256) can achieve performance similar to SimCLR with a large batch size (e.g., 8192) on the self-supervised learning task on ImageNet-1K. We also show that the proposed optimization technique is generic and can be applied to other contrastive losses, e.g., two-way contrastive losses for bimodal contrastive learning. The proposed method is implemented in our open-sourced library LibAUC (www.libauc.org).

Comment: Accepted by ICML 2022
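The abstract describes the algorithm only at a high level: the key idea it alludes to is replacing the mini-batch normalizer of the contrastive loss with a per-anchor running estimate of the aggregate over all negatives, so that small batches still give useful gradient estimates of the global objective. The sketch below is our own PyTorch illustration of that moving-average idea, not the authors' LibAUC implementation; the class name SogCLRSketch, the buffer u, and the hyperparameters gamma and tau are hypothetical names chosen for this example.

import torch
import torch.nn.functional as F

class SogCLRSketch:
    """Illustrative sketch of a moving-average contrastive estimator (not the LibAUC API)."""

    def __init__(self, num_samples: int, gamma: float = 0.9, tau: float = 0.1):
        self.u = torch.zeros(num_samples)  # per-example running estimate of the negative-pair aggregate
        self.gamma = gamma                 # moving-average decay (assumed hyperparameter)
        self.tau = tau                     # temperature

    def loss(self, z1, z2, idx):
        # z1, z2: embeddings of two augmented views of the same images, shape (B, d)
        # idx: dataset indices of the anchors in this mini-batch, shape (B,)
        z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
        B = z1.size(0)
        sim = z1 @ z2.t() / self.tau                         # pairwise similarities scaled by temperature
        pos = sim.diag().unsqueeze(1)                        # positive-pair logit for each anchor
        neg_mask = ~torch.eye(B, dtype=torch.bool, device=sim.device)
        # mini-batch estimate of g_i: average over negatives of exp((s_neg - s_pos)/tau)
        g_hat = ((sim - pos).exp() * neg_mask).sum(1) / (B - 1)
        # update the per-anchor moving-average estimators with the detached batch estimate
        idx = idx.to(self.u.device)
        u_idx = (1 - self.gamma) * self.u[idx] + self.gamma * g_hat.detach().cpu()
        self.u[idx] = u_idx
        # surrogate whose gradient approximates d/dw [tau * log g_i(w)] = tau * grad(g_i) / g_i,
        # with the running estimate u_i standing in for g_i in the denominator
        return (self.tau * g_hat / u_idx.to(g_hat.device)).mean()

In use, one would call loss = criterion.loss(z1, z2, idx) on each mini-batch and backpropagate as usual; because the denominator is the running estimate rather than the raw batch normalizer, the batch size no longer needs to be large for the gradient to track the global objective.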

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2202.12387
Document Type: Working Paper