Dynamically adaptive adjustment loss function biased towards few‐class learning
- Source :
- IET Image Processing, Vol 17, Iss 2, Pp 627-635 (2023)
- Publication Year :
- 2023
- Publisher :
- Wiley, 2023.
-
Abstract
- Convolutional neural networks have been widely used in computer vision and effectively solve practical problems. However, a loss function with fixed parameters can reduce training efficiency and even lead to poor prediction accuracy. In particular, when the data are class-imbalanced, the final result tends to favour the large class: in detection and recognition problems, the large class dominates because of its quantitative advantage, and the features of the few-class cannot be fully learned. To better learn the few-class, batch nuclear-norm maximization is introduced into the deep neural network, and an adaptive composite loss function mechanism is established to increase the diversity of the network and thus improve prediction accuracy. The proposed loss function is applied to crowd counting and verified on the ShanghaiTech and UCF_CC_50 datasets. Experimental results show that the proposed loss function improves both the prediction accuracy and the convergence speed of deep neural networks.
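The batch nuclear-norm maximization term mentioned in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the function names, the MSE task loss, and the weighting parameter `lam` are assumptions for illustration. The idea is that maximizing the nuclear norm (sum of singular values) of the batch prediction matrix encourages both confident and diverse predictions, which counteracts collapse onto the large class.

```python
import numpy as np

def batch_nuclear_norm_loss(probs):
    """Negative batch nuclear norm of a (batch, classes) prediction matrix.

    A larger nuclear norm means the batch predictions are both confident
    and spread across more classes; returning its negative lets a
    minimizer maximize it.
    """
    # Singular values of the batch prediction matrix
    s = np.linalg.svd(probs, compute_uv=False)
    # Negative mean nuclear norm over the batch
    return -s.sum() / probs.shape[0]

def composite_loss(pred, target, probs, lam=0.1):
    """Hypothetical composite loss: task loss plus a weighted BNM term.

    `lam` (the trade-off weight) is an illustrative assumption, not a
    value from the paper.
    """
    task = np.mean((pred - target) ** 2)  # placeholder task loss (MSE)
    return task + lam * batch_nuclear_norm_loss(probs)

# A diverse, confident batch (one class per sample) scores lower (better)
# than a batch collapsed onto a single class.
diverse = np.eye(4)                              # 4 samples, 4 classes
collapsed = np.tile([1.0, 0.0, 0.0, 0.0], (4, 1))  # all predict class 0
```

In this toy example the diverse batch has nuclear norm 4 (loss -1.0) while the collapsed batch has nuclear norm 2 (loss -0.5), so the diversity term penalizes the collapsed predictions.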
- Subjects :
- Photography (TR1-1050)
- Computer software (QA76.75-76.765)
Details
- Language :
- English
- ISSN :
- 1751-9667 and 1751-9659
- Volume :
- 17
- Issue :
- 2
- Database :
- Directory of Open Access Journals
- Journal :
- IET Image Processing
- Publication Type :
- Academic Journal
- Accession number :
- edsdoj.7d83992ebe94addacde5437320fb640
- Document Type :
- article
- Full Text :
- https://doi.org/10.1049/ipr2.12661