
Similarity contrastive estimation for self-supervised soft contrastive learning

Authors:
Denize, Julien
Rabarisoa, Jaonary
Orcesi, Astrid
Hérault, Romain
Canu, Stéphane
Département Intelligence Ambiante et Systèmes Interactifs (DIASI), Laboratoire d'Intégration des Systèmes et des Technologies (LIST), Direction de Recherche Technologique (DRT), CEA, Université Paris-Saclay
Laboratoire d'Informatique, du Traitement de l'Information et des Systèmes (LITIS), Normandie Université (NU): Université Le Havre Normandie (ULH), Université de Rouen Normandie (UNIROUEN), Institut national des sciences appliquées Rouen Normandie (INSA Rouen Normandie)
Source:
2023 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), Waikoloa, HI, USA, IEEE Computer Society, Jan 2023, pp. 2705-2715, ⟨10.1109/WACV56688.2023.00273⟩
Publication Year:
2023
Publisher:
HAL CCSD, 2023.

Abstract

Contrastive representation learning has proven to be an effective self-supervised learning method. Most successful approaches are based on Noise Contrastive Estimation (NCE) and use different views of an instance as positives, to be contrasted with other instances, called negatives, which are treated as noise. However, several instances in a dataset are drawn from the same distribution and share underlying semantic information. A good data representation should contain relations, or semantic similarity, between the instances. Contrastive learning implicitly learns relations, but treating all negatives as noise harms the quality of the learned relations. To circumvent this issue, we propose a novel formulation of contrastive learning using semantic similarity between instances, called Similarity Contrastive Estimation (SCE). Our training objective is a soft contrastive one: instead of hard-classifying instances as positives or negatives, we estimate from one view of a batch a continuous distribution that pushes or pulls instances according to their semantic similarities. This target similarity distribution is sharpened to eliminate noisy relations. For each instance, the model predicts the target distribution from another view while contrasting its positive with the negatives. Experimental results show that SCE achieves the best top-1 accuracy on the ImageNet linear evaluation protocol at 100 pretraining epochs, with 72.1%, and is competitive with state-of-the-art algorithms, reaching 75.4% at 200 epochs with multi-crop. We also show that SCE generalizes to several other tasks. Source code is available at: https://github.com/CEA-LIST/SCE.
Comment: Accepted to IEEE Winter Conference on Applications of Computer Vision (WACV) 2023.
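The objective described in the abstract can be made concrete with a short sketch. The PyTorch code below is a minimal, illustrative version, not the authors' implementation (see the linked repository for that): the temperatures tau and tau_t, the mixing weight lam, and the use of a MoCo-style queue of negatives with a momentum branch for view 1 are assumptions modeled on common contrastive setups.

    import torch
    import torch.nn.functional as F

    def sce_loss(z1, z2, queue, tau=0.1, tau_t=0.07, lam=0.5):
        # Illustrative SCE-style soft contrastive loss (not the official code).
        # z1: (B, D) L2-normalized embeddings of view 1 (assumed momentum branch).
        # z2: (B, D) L2-normalized embeddings of view 2 (online branch).
        # queue: (K, D) L2-normalized negative embeddings.
        B = z1.size(0)
        with torch.no_grad():
            # Target relations: view-1 similarities to the negatives,
            # sharpened by the lower temperature tau_t.
            soft = F.softmax(z1 @ queue.t() / tau_t, dim=1)          # (B, K)
            # Mix a one-hot positive (slot 0) with the soft relations.
            target = torch.cat(
                [lam * torch.ones(B, 1, device=z1.device),
                 (1.0 - lam) * soft], dim=1)                          # (B, 1+K)
        # Online prediction: positive similarity plus negatives, at tau.
        pos = (z2 * z1.detach()).sum(dim=1, keepdim=True)             # (B, 1)
        logits = torch.cat([pos, z2 @ queue.t()], dim=1) / tau        # (B, 1+K)
        # Soft cross-entropy between the sharpened target and the prediction.
        return -(target * F.log_softmax(logits, dim=1)).sum(dim=1).mean()

    # Toy usage with random normalized embeddings:
    B, D, K = 8, 128, 1024
    z1 = F.normalize(torch.randn(B, D), dim=1)
    z2 = F.normalize(torch.randn(B, D), dim=1)
    queue = F.normalize(torch.randn(K, D), dim=1)
    loss = sce_loss(z1, z2, queue)

In this reading, choosing tau_t < tau is what sharpens the target distribution, and lam trades off the hard NCE term (the one-hot positive) against the relational term (the soft similarity distribution).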

Details

Language:
English
Database:
OpenAIRE
Journal:
2023 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), Waikoloa, HI, USA, IEEE Computer Society, Jan 2023, pp. 2705-2715, ⟨10.1109/WACV56688.2023.00273⟩
Accession number:
edsair.doi.dedup.....f8ee9d9d151debd6425487cc7302fe99