
Guided-Generative Network for noise detection in Monte-Carlo rendering

Authors :
Jérôme Buisine
Fabien Teytaud
Samuel Delepoulle
Christophe Renaud
Université du Littoral Côte d'Opale (ULCO)
Laboratoire d'Informatique Signal et Image de la Côte d'Opale (LISIC)
ANR-17-CE38-0009, PrISE-3D, Perception Interaction et Simulation d'Eclairage 3D (2017)
Source :
International Conference on Machine Learning and Applications, Dec 2021, Pasadena, United States, HAL
Publication Year :
2021
Publisher :
HAL CCSD, 2021.

Abstract

Estimating the features to be extracted from an image for classification tasks is sometimes difficult, especially when the images are affected by a particular kind of noise. The aim of this paper is to propose a neural network architecture named Guided-Generative Network (GGN) to extract refined information that allows the noise present in a sliding window of an image to be correctly quantified. GGN tends to find the features required to address such a problem and to emit a detection criterion for this noise. The proposed GGN is applied to photorealistic images rendered by Monte-Carlo methods, which evaluate a large number of samples per pixel. An insufficient number of samples per pixel tends to leave residual noise that is very noticeable to humans. This noise can be reduced by increasing the number of samples, as Monte-Carlo theory shows, but at a considerable computational cost. Finding the right number of samples needed for human observers to perceive no noise is still an open problem. The results obtained show that GGN correctly solves the problem without prior knowledge of the noise while remaining competitive with existing methods.
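
As a brief aside on the convergence claim in the abstract (a standard Monte-Carlo result rather than anything specific to this paper), the pixel estimate and its error behave as follows, with N the number of samples per pixel and L(x_k) the radiance carried by the k-th sample:

% Standard Monte-Carlo pixel estimator: the pixel value is the mean of N
% radiance samples, and the standard error shrinks as 1/sqrt(N).
\hat{I}_N = \frac{1}{N} \sum_{k=1}^{N} L(x_k),
\qquad \operatorname{Var}\!\big[\hat{I}_N\big] = \frac{\sigma^2}{N},
\qquad \text{RMSE} \sim \frac{\sigma}{\sqrt{N}}

Halving the residual noise therefore requires roughly four times as many samples, which is why a purely sample-count-based strategy becomes expensive and a perceptual stopping criterion such as the one studied here is attractive.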

Details

Language :
English
Database :
OpenAIRE
Journal :
International Conference on Machine Learning and Applications, Dec 2021, Pasadena, United States, HAL
Accession number :
edsair.doi.dedup.....83c1755c8efe572b6c69280252d67067