
Accelerating Neural Field Training via Soft Mining

Authors :
Kheradmand, Shakiba
Rebain, Daniel
Sharma, Gopal
Isack, Hossam
Kar, Abhishek
Tagliasacchi, Andrea
Yi, Kwang Moo
Publication Year :
2023

Abstract

We present an approach to accelerate Neural Field training by efficiently selecting sampling locations. While Neural Fields have recently become popular, they are often trained by uniformly sampling the training domain, or through handcrafted heuristics. We show that improved convergence and final training quality can be achieved by a soft mining technique based on importance sampling: rather than either considering or ignoring a pixel completely, we weigh the corresponding loss by a scalar. To implement our idea, we use Langevin Monte-Carlo sampling. We show that by doing so, regions with higher error are selected more frequently, leading to a more than 2x improvement in convergence speed. The code and related resources for this study are publicly available at https://ubc-vision.github.io/nf-soft-mining/.
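
Illustrative sketch (not the authors' implementation): the PyTorch snippet below shows one training step in which pixel coordinates are moved by a Langevin Monte-Carlo update toward high-error regions, and each sample's loss is re-weighted by the inverse of its assumed sampling density so the weighted objective remains an estimate of the uniform-sampling loss. The neural field "field", the ground-truth image "image", the target density proportional to exp(error / temperature), the bilinear lookup helper "sample_image", and the hyperparameters "temperature" and "step_size" are assumptions made for this example, not details taken from the paper.

    import torch
    import torch.nn.functional as F

    def sample_image(image, coords):
        # image: (H, W, 3) in [0, 1]; coords: (N, 2) with (x, y) in [0, 1].
        grid = coords.view(1, -1, 1, 2) * 2.0 - 1.0        # grid_sample expects [-1, 1]
        img = image.permute(2, 0, 1).unsqueeze(0)           # (1, 3, H, W)
        out = F.grid_sample(img, grid, align_corners=True)  # (1, 3, N, 1)
        return out.squeeze(0).squeeze(-1).permute(1, 0)     # (N, 3)

    def soft_mining_step(field, image, coords, optimizer,
                         temperature=0.1, step_size=1e-3):
        # One illustrative training step: "field" maps (x, y) in [0, 1]^2 to RGB.
        coords = coords.detach().requires_grad_(True)

        pred = field(coords)                                  # (N, 3)
        target = sample_image(image, coords)                  # (N, 3)
        per_sample_err = ((pred - target) ** 2).mean(dim=-1)  # (N,)

        # Soft mining: samples are (approximately) drawn from a density
        # proportional to exp(err / T), so weight each loss by the inverse
        # unnormalized density (self-normalized over the batch) instead of
        # hard-selecting or discarding pixels.
        with torch.no_grad():
            w = torch.exp(-per_sample_err / temperature)
            w = w / w.mean()
        loss = (w * per_sample_err).mean()

        # Gradient of the error w.r.t. the sample coordinates, reused below
        # for the Langevin proposal.
        grad_coords, = torch.autograd.grad(per_sample_err.sum(), coords,
                                           retain_graph=True)

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

        # Langevin Monte-Carlo move: ascend the error surface plus Gaussian
        # noise, so high-error regions are visited more frequently.
        with torch.no_grad():
            coords = (coords
                      + (step_size / temperature) * grad_coords
                      + (2.0 * step_size) ** 0.5 * torch.randn_like(coords))
            coords.clamp_(0.0, 1.0)  # keep samples inside the normalized image domain
        return coords.detach(), loss.item()

In this sketch the sample coordinates serve as both the training batch and the Markov-chain state, carried from step to step; clamping at the image border is a simplification, and in practice one might re-initialize part of the chain uniformly from time to time to keep the whole domain covered.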

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2312.00075
Document Type :
Working Paper