Crack Segmentation Network using Additive Attention Gate—CSN-II.

Authors :
Ali, Raza
Chuah, Joon Huang
Talip, Mohamad Sofian Abu
Mokhtar, Norrima
Shoaib, Muhammad Ali
Source :
Engineering Applications of Artificial Intelligence. Sep 2022, Vol. 114, pN.PAG-N.PAG. 1p.
Publication Year :
2022

Abstract

One of the emerging and powerful tools of Artificial Intelligence (AI) in computer vision is the Convolutional Neural Network (CNN), which can outperform traditional crack detection algorithms by extracting distinctive image features. Crack image segmentation is strongly affected by the imbalance between crack and non-crack pixels. To tackle the influence of class-imbalanced datasets on network training, we propose an additive attention gate-based network architecture called Crack Segmentation Network-II (CSN-II). CSN-II has fewer encoder–decoder blocks, with improved accuracy and reduced computational cost compared to other crack segmentation architectures. An additive attention gate is used as the connecting block between the encoder and decoder sections of CSN-II, focusing the network on significant crack regions in the image. Network performance is evaluated on two crack image datasets: MSCI (500 images) and CFD (118 images). Experimental results show that the CSN-II architecture trained with a local balanced cross-entropy (LBCE) loss function achieves 98.48% and 94.39% mean accuracy on the MSCI and CFD datasets, respectively. Furthermore, extensive experiments on the MSCI and CFD datasets evaluate the combinations of five network architectures (U-Net, SegNet, DeepLabv3+, CSN, and CSN-II) and twelve loss functions to identify the most effective configuration for crack segmentation on imbalanced datasets. [ABSTRACT FROM AUTHOR]
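The abstract describes the additive attention gate and the LBCE loss only at a high level. As a rough illustration, the sketch below shows how an additive attention gate (in the style of Attention U-Net) and a class-balanced cross-entropy weighting could look in PyTorch. The class name, channel arguments, and the global balancing in `balanced_bce` are assumptions for illustration, not the CSN-II implementation or the paper's local balanced cross-entropy loss.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdditiveAttentionGate(nn.Module):
    """Additive attention gate (Attention U-Net style) — illustrative sketch.

    NOTE: hypothetical layer configuration; the paper does not publish
    its exact architecture here, so channel counts are assumptions.
    """

    def __init__(self, gate_channels, skip_channels, inter_channels):
        super().__init__()
        # 1x1 convolutions project the gating signal (decoder feature)
        # and the skip connection (encoder feature) to a common space.
        self.w_gate = nn.Conv2d(gate_channels, inter_channels, kernel_size=1)
        self.w_skip = nn.Conv2d(skip_channels, inter_channels, kernel_size=1)
        # psi collapses the joint feature map to a single attention map.
        self.psi = nn.Conv2d(inter_channels, 1, kernel_size=1)

    def forward(self, gate, skip):
        # Additive attention: sum the two projections, apply ReLU,
        # then a sigmoid yields per-pixel coefficients in [0, 1] that
        # re-weight the skip features toward likely crack regions.
        g = self.w_gate(gate)
        if g.shape[-2:] != skip.shape[-2:]:
            g = F.interpolate(g, size=skip.shape[-2:], mode="bilinear",
                              align_corners=False)
        attn = torch.sigmoid(self.psi(F.relu(g + self.w_skip(skip))))
        return skip * attn


def balanced_bce(logits, targets, eps=1e-6):
    """Class-balanced binary cross-entropy — illustrative approximation.

    The paper's LBCE is a *local* balanced variant; this global version
    only shows the general idea of re-weighting rare crack pixels
    against the dominant background pixels.
    """
    pos = targets.sum()
    neg = targets.numel() - pos
    beta = neg / (pos + neg + eps)  # fraction of non-crack pixels
    weights = beta * targets + (1.0 - beta) * (1.0 - targets)
    return F.binary_cross_entropy_with_logits(logits, targets, weight=weights)
```

In an encoder–decoder segmentation network, such a gate would sit on each skip connection, filtering encoder features with the decoder's coarser gating signal before concatenation, which matches the abstract's description of the gate as the connecting block between encoder and decoder.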

Details

Language :
English
ISSN :
0952-1976
Volume :
114
Database :
Academic Search Index
Journal :
Engineering Applications of Artificial Intelligence
Publication Type :
Academic Journal
Accession number :
158389692
Full Text :
https://doi.org/10.1016/j.engappai.2022.105130