
Restoring vision in hazy weather with hierarchical contrastive learning.

Authors :
Wang, Tao
Tao, Guangpin
Lu, Wanglong
Zhang, Kaihao
Luo, Wenhan
Zhang, Xiaoqin
Lu, Tong
Source :
Pattern Recognition. Jan 2024, Vol. 145.
Publication Year :
2024

Abstract

Image restoration under hazy weather conditions, known as single image dehazing, has been of significant interest for various computer vision applications. In recent years, deep learning-based methods have achieved success. However, existing image dehazing methods typically neglect the hierarchy of features in the neural network and fail to fully exploit their relationships. To this end, we propose an effective image dehazing method named Hierarchical Contrastive Dehazing (HCD), which is based on feature fusion and contrastive learning strategies. HCD consists of a hierarchical dehazing network (HDN) and a novel hierarchical contrastive loss (HCL). Specifically, the core design in the HDN is a hierarchical interaction module, which utilizes multi-scale activation to revise the feature responses hierarchically. To cooperate with the training of HDN, we propose HCL, which performs contrastive learning on hierarchically paired exemplars, facilitating haze removal. Extensive experiments on the public RESIDE, HazeRD, and DENSE-HAZE datasets demonstrate that HCD quantitatively outperforms state-of-the-art methods in terms of PSNR and SSIM and achieves better visual quality.

• Our method employs contrastive learning to enhance feature representation ability.
• The hierarchical interaction module allows information to be exchanged efficiently across different branches.
• The hierarchical contrastive loss effectively guides the model to learn valid features for image dehazing.

[ABSTRACT FROM AUTHOR]
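The abstract does not give the exact formulation of the hierarchical contrastive loss (HCL). As a rough illustration only, the sketch below shows one common way such a loss is built for dehazing: at each feature level, the restored image (anchor) is pulled toward the clean image (positive) and pushed away from the hazy input (negative). The function name, level weights, and use of L1 feature distances are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of a multi-level (hierarchical) contrastive loss for dehazing.
# NOT the paper's exact HCL: feature extraction, pairing, and distances are assumed.
import torch
import torch.nn.functional as F

def hierarchical_contrastive_loss(anchor_feats, positive_feats, negative_feats,
                                  level_weights=None, eps=1e-7):
    """Each argument is a list of feature tensors, one per hierarchy level
    (e.g. encoder stages at different scales), all aligned in shape per level."""
    if level_weights is None:
        level_weights = [1.0] * len(anchor_feats)
    loss = torch.zeros((), dtype=anchor_feats[0].dtype, device=anchor_feats[0].device)
    for a, p, n, w in zip(anchor_feats, positive_feats, negative_feats, level_weights):
        d_ap = F.l1_loss(a, p)  # pull restored features toward the clean (positive) features
        d_an = F.l1_loss(a, n)  # push restored features away from the hazy (negative) features
        loss = loss + w * d_ap / (d_an + eps)
    return loss
```

In practice the three feature lists would come from the same frozen feature extractor (or from stages of the dehazing network itself) applied to the restored output, the ground-truth clean image, and the hazy input, and the loss would be added to a standard reconstruction term during training.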

Details

Language :
English
ISSN :
00313203
Volume :
145
Database :
Academic Search Index
Journal :
Pattern Recognition
Publication Type :
Academic Journal
Accession number :
172778097
Full Text :
https://doi.org/10.1016/j.patcog.2023.109956