
FINER++: Building a Family of Variable-periodic Functions for Activating Implicit Neural Representation

Authors :
Zhu, Hao
Liu, Zhen
Zhang, Qi
Fu, Jingde
Deng, Weibing
Ma, Zhan
Guo, Yanwen
Cao, Xun
Publication Year :
2024

Abstract

Implicit Neural Representation (INR), which utilizes a neural network to map coordinate inputs to corresponding attributes, is causing a revolution in the field of signal processing. However, current INR techniques suffer from the "frequency"-specified spectral bias and the capacity-convergence gap, resulting in imperfect performance when representing complex signals with multiple "frequencies". We have identified that both of these issues can be handled by increasing the utilization of the definition domain of current activation functions, for which we propose the FINER++ framework, which extends existing periodic/non-periodic activation functions to variable-periodic ones. By initializing the bias of the neural network with different ranges, sub-functions with various frequencies in the variable-periodic function are selected for activation. Consequently, the supported frequency set can be flexibly tuned, leading to improved performance in signal representation. We demonstrate the generalization and capabilities of FINER++ with different activation function backbones (Sine, Gauss, and Wavelet) and various tasks (2D image fitting, 3D signed distance field representation, 5D neural radiance fields optimization, and streamable INR transmission), and we show that it improves existing INRs. Project page: https://liuzhen0212.github.io/finerpp/

Comment: Extension of the previous CVPR paper "FINER: Flexible spectral-bias tuning in implicit neural representation by variable-periodic activation functions". arXiv admin note: substantial text overlap with arXiv:2312.02434
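The mechanism the abstract describes (a variable-periodic activation whose local frequency grows with the magnitude of its input, so that the bias initialization selects which frequency sub-function a neuron activates) can be sketched as follows. The specific form sin(ω(|x|+1)x) follows the earlier FINER paper this work extends; the function name, ω value, and bias offsets here are illustrative assumptions, not taken from this abstract:

```python
import numpy as np

def finer_sine(x, omega=30.0):
    """Variable-periodic sine activation: sin(omega * (|x| + 1) * x).

    The local frequency scales with |x| + 1, so neurons whose
    pre-activations (shifted by the bias initialization) have larger
    magnitude land on higher-frequency sub-functions.
    """
    x = np.asarray(x, dtype=float)
    return np.sin(omega * (np.abs(x) + 1.0) * x)

# Larger bias shifts the input into a higher-frequency region of the
# same activation function:
x = np.linspace(-1.0, 1.0, 1001)
low = finer_sine(x + 0.0)   # bias 0: lower local frequency
high = finer_sine(x + 5.0)  # bias 5: higher local frequency

# Count sign changes (zero crossings) as a rough frequency proxy.
def crossings(y):
    return int(np.sum(np.abs(np.diff(np.sign(y))) > 0))
```

With this sketch, `crossings(high)` exceeds `crossings(low)`, illustrating how shifting the bias range tunes the supported frequency set without changing the activation function itself.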

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2407.19434
Document Type :
Working Paper