
Evaluating the Point Cloud of Individual Trees Generated from Images Based on Neural Radiance Fields (NeRF) Method.

Authors :
Huang, Hongyu
Tian, Guoji
Chen, Chongcheng
Source :
Remote Sensing; Mar2024, Vol. 16 Issue 6, p967, 17p
Publication Year :
2024

Abstract

Three-dimensional (3D) reconstruction of trees has always been a key task in precision forestry management and research. Due to the complex branch morphology of trees and the occlusions caused by stems, branches and foliage, it is difficult to recreate a complete three-dimensional tree model from two-dimensional images using conventional photogrammetric methods. In this study, based on tree images collected by various cameras in different ways, the Neural Radiance Fields (NeRF) method was used for dense reconstruction of individual trees, and the exported point cloud models were compared with point clouds derived from photogrammetric reconstruction and laser scanning methods. The results show that the NeRF method performs well in individual tree 3D reconstruction: it achieves a higher reconstruction success rate, reconstructs the canopy area better and requires fewer images as input. Compared with the photogrammetric dense reconstruction method, NeRF has significant advantages in reconstruction efficiency and is adaptable to complex scenes, but the generated point clouds tend to be noisy and of low resolution. The accuracy of tree structural parameters (tree height and diameter at breast height) extracted from the photogrammetric point cloud is still higher than that of parameters derived from the NeRF point cloud. The results of this study illustrate the great potential of the NeRF method for individual tree reconstruction, and they provide new ideas and research directions for 3D reconstruction and visualization of complex forest scenes. [ABSTRACT FROM AUTHOR]
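The structural parameters compared in the abstract (tree height and diameter at breast height, DBH) are standard point-cloud measurements. As a rough illustration of how such values are commonly derived, and not the authors' own pipeline, the sketch below estimates tree height as the vertical extent of the cloud and approximates DBH by fitting a circle to a thin stem slice 1.3 m above the lowest point; the file name, slice thickness and z-up metre-scaled coordinates are assumptions.

```python
# Minimal sketch (not the paper's pipeline): estimate tree height and DBH
# from an individual-tree point cloud given as an N x 3 array of x, y, z
# coordinates in metres with z pointing up. "tree.xyz" is a hypothetical file.
import numpy as np

def tree_height(points: np.ndarray) -> float:
    """Height = vertical extent of the cloud (highest z minus ground z)."""
    z = points[:, 2]
    return float(z.max() - z.min())

def dbh(points: np.ndarray, breast_height: float = 1.3,
        slice_thickness: float = 0.10) -> float:
    """Approximate DBH by a least-squares circle fit to a stem slice at 1.3 m."""
    z0 = points[:, 2].min() + breast_height
    mask = np.abs(points[:, 2] - z0) < slice_thickness / 2
    xy = points[mask, :2]
    if len(xy) < 10:
        raise ValueError("Too few stem points at breast height")
    # Algebraic circle fit: solve x^2 + y^2 = 2*a*x + 2*b*y + c for (a, b, c),
    # where (a, b) is the centre and r^2 = c + a^2 + b^2.
    A = np.column_stack([2 * xy[:, 0], 2 * xy[:, 1], np.ones(len(xy))])
    rhs = (xy ** 2).sum(axis=1)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    radius = np.sqrt(c + a ** 2 + b ** 2)
    return float(2 * radius)

if __name__ == "__main__":
    pts = np.loadtxt("tree.xyz")  # hypothetical whitespace-separated x y z file
    print(f"Tree height: {tree_height(pts):.2f} m")
    print(f"DBH:         {dbh(pts):.3f} m")
```

In practice the NeRF-derived clouds described in the abstract would need noise filtering and ground removal before such a fit is reliable, which is consistent with the authors' observation that the photogrammetric point cloud still yields more accurate structural parameters.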

Details

Language :
English
ISSN :
2072-4292
Volume :
16
Issue :
6
Database :
Complementary Index
Journal :
Remote Sensing
Publication Type :
Academic Journal
Accession Number :
176366538
Full Text :
https://doi.org/10.3390/rs16060967