1. Sparse representation optimization of image Gaussian mixture features based on a convolutional neural network
- Author
Zhangquan Wang, Tiaojuan Ren, Fangfang Ye, and Ting Wang
- Subjects
Optimization problem, Kernel (image processing), Artificial Intelligence, Computer science, Inpainting, Sparse approximation, Overfitting, Mixture model, Convolutional neural network, Algorithm, Software, Dropout (neural networks)
- Abstract
This paper analyzes the inherent relationship between convolutional neural networks and sparse representation and, in response to the shortcomings of current methods, proposes an improved convolutional neural network model for image synthesis. In existing sparse-representation methods, computing the sparse coefficients in the testing phase requires solving complex optimization problems, which greatly reduces operating efficiency. Inspired by the successful application of convolutional neural networks to image reconstruction, the proposed model, unlike traditional image portrait synthesis methods, is end-to-end with a closed form and needs no complex optimization in the synthesis stage. Synthesis experiments on an image dataset show that the method not only improves synthesis quality but also runs one to two orders of magnitude faster than traditional methods, demonstrating its potential application value.
Block-based processing is a common approach to sparse-domain image modeling; it improves computational efficiency but weakens the global structure of the image, a loss that is difficult to compensate for by aggregating overlapping image blocks. To address this problem, the paper proposes a low-rank image inpainting method based on a Gaussian mixture model. The method embeds the local statistical characteristics of image blocks into a nuclear-norm model: the Gaussian mixture model preserves local image detail, while the nuclear norm captures the global low-rank structure of the image. This allows a class of image data with an underlying low-rank structure to be restored and theoretically reveals the structured sparse nature of the Gaussian mixture model.
Finally, the paper refines the strategy of randomly dropping hidden neuron nodes and proposes a sparsity-based dropout strategy against overfitting. Experiments show that this strategy effectively improves convergence speed while maintaining good performance and effectively prevents overfitting.
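The abstract does not state the exact inpainting objective. One plausible form of a combined nuclear-norm and Gaussian-mixture model, given only as an illustrative sketch (the weight $\lambda$, block-extraction operators $R_i$, mixture parameters $\pi_k, \mu_k, \Sigma_k$, and observed-pixel set $\Omega$ are assumptions, not taken from the paper), is

$$
\hat{X} \;=\; \arg\min_{X}\; \|X\|_{*} \;-\; \lambda \sum_{i} \log \sum_{k=1}^{K} \pi_{k}\, \mathcal{N}\!\left(R_{i}X \mid \mu_{k}, \Sigma_{k}\right)
\qquad \text{s.t.} \qquad P_{\Omega}(X) = P_{\Omega}(Y),
$$

where $\|X\|_{*}$, the nuclear norm (sum of singular values), promotes the global low-rank structure, the log-likelihood term is a Gaussian-mixture prior on the image blocks $R_{i}X$ that preserves local detail, and $P_{\Omega}$ constrains the reconstruction to agree with the corrupted image $Y$ on the observed pixels $\Omega$.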
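The sparsity-based dropout strategy is described only at a high level. The sketch below shows one possible reading, in which each hidden unit's keep probability is modulated by how strongly it activates on the current mini-batch; the function name, the keep-probability formula, and all parameters are illustrative assumptions rather than the authors' algorithm.

```python
import numpy as np

def sparsity_based_dropout(activations, base_keep=0.5, rng=None):
    """Dropout whose per-unit keep probability grows with the unit's
    average activation magnitude over the mini-batch.

    activations: array of shape (batch, units).
    Returns the masked activations and the boolean keep mask.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Per-unit activity level, normalised to [0, 1] across the batch.
    activity = np.abs(activations).mean(axis=0)
    activity = activity / (activity.max() + 1e-8)
    # Keep probability centred on base_keep, shifted by relative activity,
    # and clipped so every unit retains a nonzero chance of surviving.
    keep_prob = np.clip(base_keep + 0.5 * (activity - activity.mean()),
                        0.05, 0.95)
    mask = rng.random(activations.shape[1]) < keep_prob
    # Inverted-dropout scaling keeps the expected activation unchanged.
    return activations * mask / keep_prob, mask

# Example: one mini-batch of 8 samples through a 16-unit hidden layer.
h = np.random.randn(8, 16)
h_dropped, keep_mask = sparsity_based_dropout(h)
```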
- Published
2021