
A grayscale image enhancement algorithm based on dense residual and attention mechanism.

Authors :
Ye, Meng
Yang, Shi'en
He, Yujun
Peng, Zhangjun
Source :
Visual Computer; Mar2024, Vol. 40 Issue 3, p1983-1995, 13p
Publication Year :
2024

Abstract

Deep learning shows great potential in low-light image enhancement: it can improve image brightness and contrast while keeping the image natural. However, owing to the lack of manually extracted prior knowledge or to excessive amplification of noise, existing methods often produce enhanced images of poor quality. To address these challenging problems, this paper proposes a dual-branch grayscale image enhancement network based on dense residuals and an attention mechanism. A low-light grayscale image is used as input. First, features that fuse deep and shallow information are extracted through a dense residual convolution branch; second, texture features are extracted through a U-Net branch combined with an attention mechanism; the extracted features are then integrated, and finally the luminance is adjusted by a brightness adjustment module to output an enhanced grayscale image. In addition, a joint loss function is designed that measures the network training loss in terms of brightness, texture, contrast, and noise. Extensive quantitative and qualitative experiments on the LOL and VE-LOL datasets show that the proposed method improves the Peak Signal-to-Noise Ratio (PSNR) by 19.65–59.76% and the Structural Similarity Index (SSIM) by 5.61–85.53% compared with EnlightenGAN, KinD++, RUAS, LLFlow, and other methods. The proposed method outperforms these well-known methods thanks to the deep feature extraction and fusion capability of the dense residual convolutional network and the texture extraction capability of the U-Net combined with the attention mechanism. [ABSTRACT FROM AUTHOR]
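The pipeline described in the abstract (dense residual branch, attention branch, feature fusion, brightness adjustment) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the layer count, kernel sizes, spatial-attention form, fusion weights, and `gain` parameter are all assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv3x3(x, k):
    """Tiny 'same' 3x3 convolution via edge padding (stand-in for a learned layer)."""
    p = np.pad(x, 1, mode="edge")
    out = np.zeros_like(x)
    for i in range(3):
        for j in range(3):
            out += k[i, j] * p[i:i + x.shape[0], j:j + x.shape[1]]
    return out

def dense_residual_branch(x, n_layers=3):
    """Dense connections: each layer sees the sum of all earlier outputs,
    and the block output is added back to the input (residual link)."""
    feats = [x]
    for _ in range(n_layers):
        k = rng.normal(scale=0.1, size=(3, 3))
        feats.append(np.maximum(0.0, conv3x3(sum(feats), k)))  # ReLU
    return x + feats[-1]

def attention_branch(x):
    """Illustrative spatial attention: a sigmoid mask re-weights texture regions
    (stands in for the paper's U-Net branch with attention)."""
    mask = 1.0 / (1.0 + np.exp(-conv3x3(x, rng.normal(scale=0.1, size=(3, 3)))))
    return mask * x

def enhance(low_light, gain=1.8):
    """Fuse both branches, then apply a simple brightness-adjustment step."""
    fused = 0.5 * (dense_residual_branch(low_light) + attention_branch(low_light))
    return np.clip(gain * fused, 0.0, 1.0)

img = rng.uniform(0.0, 0.2, size=(8, 8))  # dark grayscale patch in [0, 1]
out = enhance(img)
```

With random (untrained) kernels the sketch only demonstrates the data flow; in the paper, the branch weights are learned under the joint brightness/texture/contrast/noise loss.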

Details

Language :
English
ISSN :
0178-2789
Volume :
40
Issue :
3
Database :
Complementary Index
Journal :
Visual Computer
Publication Type :
Academic Journal
Accession number :
175459349
Full Text :
https://doi.org/10.1007/s00371-023-02896-w