
NeuBTF: Neural fields for BTF encoding and transfer.

Authors :
Rodriguez-Pardo, Carlos
Kazatzis, Konstantinos
Lopez-Moreno, Jorge
Garces, Elena
Source :
Computers & Graphics. Aug 2023, Vol. 114, p239-246. 8p.
Publication Year :
2023

Abstract

Neural material representations are becoming a popular way to represent materials for rendering. They are more expressive than analytic models and occupy less memory than tabulated BTFs. However, existing neural materials are immutable: their output for a given query of UVs, camera, and light vectors is fixed once they are trained. While this is practical when there is no need to edit the material, it becomes very limiting when the fragment of the material used for training is too small or not tileable, which frequently happens when the material has been captured with a gonioreflectometer. In this paper, we propose a novel neural material representation which jointly tackles the problems of BTF compression, tiling, and extrapolation. At test time, our method takes a guidance image as input to condition the neural BTF on the structural features of that image. The neural BTF can then be queried as a regular BTF using UVs, camera, and light vectors. Every component in our framework is purposefully designed to maximize BTF encoding quality at minimal parameter count and computational complexity, achieving compression rates competitive with previous work. We demonstrate the results of our method on a variety of synthetic and captured materials, showing its generality and capacity to learn to represent many optical properties.

• Neural BTF representations allow for efficient reflectance encoding.
• Autoencoder models trained with data augmentation allow for material propagation.
• Neural reflectance propagation enables the creation of tileable BTFs. [ABSTRACT FROM AUTHOR]
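To make the query interface described above concrete, the sketch below shows how a guidance-conditioned neural BTF could in principle be evaluated: a small convolutional encoder turns the guidance image into a spatial feature grid, and an MLP decodes reflectance from the feature sampled at the query UV together with the camera and light directions. The module names, layer sizes, and bilinear feature lookup are illustrative assumptions made for this sketch, not the architecture described in the paper.

```python
# Minimal sketch of a guidance-conditioned neural-BTF query (assumed design, not the paper's).
import torch
import torch.nn as nn
import torch.nn.functional as F

class GuidanceEncoder(nn.Module):
    """Maps a guidance image (B, 3, H, W) to a spatial feature grid (hypothetical layer sizes)."""
    def __init__(self, feat_dim=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, feat_dim, 3, padding=1),
        )

    def forward(self, guidance):
        return self.net(guidance)  # (B, feat_dim, H, W)

class NeuralBTF(nn.Module):
    """Decodes RGB reflectance from a local feature plus camera/light directions."""
    def __init__(self, feat_dim=16, hidden=64):
        super().__init__()
        self.decoder = nn.Sequential(
            nn.Linear(feat_dim + 6, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3),
        )

    def forward(self, feat_grid, uv, wo, wi):
        # uv in [0, 1]^2; wo/wi are unit camera/light directions, shaped (B, N, 2) and (B, N, 3).
        grid = uv.unsqueeze(2) * 2.0 - 1.0                            # (B, N, 1, 2) in [-1, 1]
        feats = F.grid_sample(feat_grid, grid, align_corners=True)    # bilinear lookup: (B, C, N, 1)
        feats = feats.squeeze(-1).permute(0, 2, 1)                    # (B, N, C)
        x = torch.cat([feats, wo, wi], dim=-1)                        # (B, N, C + 6)
        return self.decoder(x)                                        # (B, N, 3) reflectance

# Usage: condition on a guidance image once, then query like a tabulated BTF.
encoder, btf = GuidanceEncoder(), NeuralBTF()
guidance = torch.rand(1, 3, 256, 256)                 # placeholder guidance image
feat_grid = encoder(guidance)
uv = torch.rand(1, 1024, 2)
wo = F.normalize(torch.rand(1, 1024, 3), dim=-1)
wi = F.normalize(torch.rand(1, 1024, 3), dim=-1)
rgb = btf(feat_grid, uv, wo, wi)                      # (1, 1024, 3)
```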

Details

Language :
English
ISSN :
0097-8493
Volume :
114
Database :
Academic Search Index
Journal :
Computers & Graphics
Publication Type :
Academic Journal
Accession number :
171311668
Full Text :
https://doi.org/10.1016/j.cag.2023.06.018