
An Improved Convolutional Neural Network-Based Scene Image Recognition Method.

Authors :
Wang P
Qiao J
Liu N
Source :
Computational intelligence and neuroscience [Comput Intell Neurosci] 2022 Jun 29; Vol. 2022, pp. 3464984. Date of Electronic Publication: 2022 Jun 29 (Print Publication: 2022).
Publication Year :
2022

Abstract

To address open problems in scene recognition research, this paper develops a new convolutional neural network (CNN) target detection model that achieves a better balance between accuracy and speed in high-speed scene image recognition. First, to counter the susceptibility of fine-grained images to impurity interference and poor image quality, a preprocessing method based on Canny edge detection is designed, and the Canny operator is applied to the grayscale image. Second, the L2 regularization algorithm is used to optimize the basic CNN framework, enhancing the stability of the model in complex environments, improving its generalization ability, and raising the recognition accuracy of the algorithm to a certain extent. Finally, campus environment datasets collected under different environmental conditions are used to carry out location recognition and heat map visualization experiments. The experiments show that, compared with the baseline CNN algorithm, the proposed algorithm offers better recognition performance and good generalization ability. This work realizes an effective combination of a multiframe convolutional neural network with the batch normalization algorithm and performs well in practical scene image recognition.

Competing Interests: The authors declare that they have no conflicts of interest.

(Copyright © 2022 Pinhe Wang et al.)
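To illustrate the pipeline the abstract describes, the following is a minimal sketch (not the authors' published code): a Canny edge-detection preprocessing step on the grayscale image feeding a small CNN whose convolution kernels carry L2 regularization and whose blocks use batch normalization. The thresholds, layer sizes, input resolution, and number of scene classes are illustrative assumptions, not values taken from the paper.

```python
# Hedged sketch of the described approach: Canny preprocessing + L2-regularized CNN.
# All hyperparameters below are assumptions for illustration only.
import cv2
import numpy as np
from tensorflow.keras import layers, models, regularizers

NUM_CLASSES = 10                    # assumed number of scene categories
CANNY_LOW, CANNY_HIGH = 100, 200    # assumed Canny hysteresis thresholds
L2_WEIGHT = 1e-4                    # assumed L2 regularization strength


def preprocess(bgr_image):
    """Convert to grayscale and apply the Canny operator to suppress
    impurities before the edge map is fed to the network."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, CANNY_LOW, CANNY_HIGH)
    edges = cv2.resize(edges, (128, 128)).astype(np.float32) / 255.0
    return edges[..., np.newaxis]   # shape (128, 128, 1)


def build_model():
    """Small CNN with L2-regularized convolution kernels and batch normalization."""
    reg = regularizers.l2(L2_WEIGHT)
    return models.Sequential([
        layers.Input(shape=(128, 128, 1)),
        layers.Conv2D(32, 3, activation="relu", kernel_regularizer=reg),
        layers.BatchNormalization(),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", kernel_regularizer=reg),
        layers.BatchNormalization(),
        layers.GlobalAveragePooling2D(),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])


model = build_model()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```

In this sketch the L2 penalty plays the stabilizing role the abstract attributes to regularization, while the Canny edge map stands in for the preprocessing step; the actual network architecture and training setup used by the authors may differ.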

Details

Language :
English
ISSN :
1687-5273
Volume :
2022
Database :
MEDLINE
Journal :
Computational intelligence and neuroscience
Publication Type :
Academic Journal
Accession number :
35814559
Full Text :
https://doi.org/10.1155/2022/3464984