
Dynamically Scaled Temperature in Self-Supervised Contrastive Learning

Authors:
Manna, Siladittya
Chattopadhyay, Soumitri
Dey, Rakesh
Bhattacharya, Saumik
Pal, Umapada
Publication Year:
2023

Abstract

In contemporary self-supervised contrastive algorithms such as SimCLR and MoCo, balancing attraction between two semantically similar samples against repulsion between samples of different classes is governed largely by the presence of hard negative samples. While the InfoNCE loss has been shown to impose penalties based on hardness, the temperature hyper-parameter is key to regulating these penalties and the trade-off between uniformity and tolerance. In this work, we focus on improving the performance of the InfoNCE loss in self-supervised learning by proposing a novel cosine-similarity-dependent temperature scaling function that effectively optimizes the distribution of samples in the feature space. We also provide mathematical analyses supporting the construction of such a dynamically scaled temperature function. Experimental evidence shows that the proposed framework outperforms contrastive-loss-based SSL algorithms.
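To make the idea concrete, below is a minimal NumPy sketch of an InfoNCE loss in which the temperature varies with the pairwise cosine similarity. The specific scaling function `dynamic_tau` (a linear decrease in temperature as similarity grows, with hypothetical constants `tau0` and `k`) is an illustrative assumption, not the function proposed in the paper; the abstract does not specify its form.

```python
import numpy as np

def cosine_sim(a, b):
    # Row-wise L2-normalize, then all-pairs cosine similarity
    a = a / np.linalg.norm(a, axis=-1, keepdims=True)
    b = b / np.linalg.norm(b, axis=-1, keepdims=True)
    return a @ b.T

def dynamic_tau(sim, tau0=0.5, k=0.5):
    # Hypothetical similarity-dependent temperature: harder negatives
    # (higher cosine similarity) receive a lower temperature, which
    # sharpens their logits and so increases their penalty.
    # With tau0=0.5, k=0.5 and sim in [-1, 1], tau stays in [0.25, 0.75].
    return tau0 * (1.0 - k * sim)

def info_nce_dynamic(z1, z2):
    # z1, z2: (N, d) embeddings of two augmented views of the same batch;
    # matching rows are positives, all other rows serve as negatives.
    sim = cosine_sim(z1, z2)                        # (N, N) similarities
    logits = sim / dynamic_tau(sim)                 # per-pair scaled logits
    logits -= logits.max(axis=1, keepdims=True)     # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))              # positives on diagonal

rng = np.random.default_rng(0)
z1 = rng.normal(size=(8, 16))
z2 = z1 + 0.1 * rng.normal(size=(8, 16))            # noisy second view
loss = info_nce_dynamic(z1, z2)
```

In a real training loop the temperature term is typically treated as a scalar per pair and detached from the gradient; this sketch only illustrates how a cosine-similarity-dependent temperature reshapes the softmax relative to a fixed one.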

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2308.01140
Document Type: Working Paper