Enhancing super resolution of oil painting patterns through optimization of UNet architecture model.
- Author
- Chang, Yun
- Subjects
- *CONVOLUTIONAL neural networks, *OIL transfer operations, *STANDARD deviations
- Abstract
Precise, content-aware style transfer for oil painting remains a significant research focus: the semantic content of the original image must be preserved while the desired painting style is accurately emulated. Developing robust algorithms that accurately recognize and classify the various patterns within oil paintings, identifying and categorizing styles, textures, and visual elements to build a comprehensive understanding of the artwork, remains a challenge. Conventional image style transfer algorithms rely mainly on mathematical modeling to characterize image texture, combining a content image with a style image to realize the style transfer effect. These algorithms neglect the distribution of image edges, leading to blurred contours in the resulting images. To address these limitations, this paper proposes a novel decoration pattern style migration algorithm based on the SRCNN (Super Resolution Convolutional Neural Network) model with a UNet-based architecture. The algorithm leverages the Laplace operator for image sharpening by accentuating edge distributions, and downsampling techniques generate low-resolution images to mitigate the protracted duration of style transfer iterations. The incorporation of Gaussian sampling and a parallelization algorithm further contributes to the method's efficacy in transferring oil painting styles. On the Facades data set, the proposed model achieves variance 6.35%, standard deviation 8.67%, uncertainty 11.21%, SSIM 94.45%, PSNR 40.24%, MSE 6.21%, and accuracy 95.00%; on the Monet2photo data set, it achieves variance 7.25%, standard deviation 9.78%, uncertainty 12.78%, SSIM 91.69%, PSNR 43.71%, MSE 8.67%, and accuracy 92.00%. The proposed method outperforms classical methods on both data sets. [ABSTRACT FROM AUTHOR]
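The two preprocessing steps the abstract names, Laplacian sharpening to accentuate edge distributions and downsampling to produce low-resolution inputs, can be sketched as below. This is an illustrative reconstruction, not the authors' code: the 3x3 Laplacian kernel, the sharpening strength, and the 2x average-pooling factor are all assumptions.

```python
import numpy as np

def laplacian_sharpen(img, strength=1.0):
    """Sharpen a grayscale image by subtracting its Laplacian.

    The Laplacian responds strongly at edges, so subtracting it
    accentuates edge distributions (assumed 3x3 kernel variant).
    """
    kernel = np.array([[0,  1, 0],
                       [1, -4, 1],
                       [0,  1, 0]], dtype=float)
    # Replicate-pad so the output keeps the input's shape.
    padded = np.pad(img.astype(float), 1, mode="edge")
    h, w = img.shape
    lap = np.zeros((h, w), dtype=float)
    for dy in range(3):          # direct 3x3 convolution
        for dx in range(3):
            lap += kernel[dy, dx] * padded[dy:dy + h, dx:dx + w]
    return np.clip(img - strength * lap, 0, 255)

def downsample_2x(img):
    """Generate a low-resolution image by 2x2 average pooling."""
    h, w = img.shape
    h, w = h - h % 2, w - w % 2  # trim odd edges
    blocks = img[:h, :w].astype(float).reshape(h // 2, 2, w // 2, 2)
    return blocks.mean(axis=(1, 3))
```

In a pipeline like the one described, the low-resolution output of `downsample_2x` would feed the SRCNN-style network, while the sharpened image preserves the edge contours the conventional algorithms blur.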
- Published
- 2024