Evaluating Neural Radiance Fields for 3D Plant Geometry Reconstruction in Field Conditions

Cited by: 0
Authors
Arshad, Muhammad Arbab [1 ]
Jubery, Talukder [2 ]
Afful, James [2 ]
Jignasu, Anushrut [2 ]
Balu, Aditya [2 ]
Ganapathysubramanian, Baskar [2 ]
Sarkar, Soumik [1 ,2 ]
Krishnamurthy, Adarsh [2 ]
Affiliations
[1] Iowa State Univ, Dept Comp Sci, Ames, IA 50011 USA
[2] Iowa State Univ, Dept Mech Engn, Ames, IA 50011 USA
Source
PLANT PHENOMICS | 2024, Vol. 6
Keywords
NeRF;
DOI
10.34133/plantphenomics.0235
CLC Classification Number
S3 [Agriculture (Agronomy)];
Discipline Code
0901;
Abstract
We evaluate different Neural Radiance Field (NeRF) techniques for the 3D reconstruction of plants in varied environments, from indoor settings to outdoor fields. Traditional methods usually fail to capture the complex geometric details of plants, which is crucial for phenotyping and breeding studies. We evaluate the reconstruction fidelity of NeRFs in 3 scenarios of increasing complexity and compare the results with the point cloud obtained using light detection and ranging (LiDAR) as ground truth. In the most realistic field scenario, the NeRF models achieve a 74.6% F1 score after 30 min of training on a graphics processing unit, highlighting the efficacy of NeRFs for 3D reconstruction in challenging environments. Additionally, we propose an early stopping technique for NeRF training that almost halves the training time while reducing the average F1 score by only 7.4%. This optimization substantially enhances the speed and efficiency of 3D reconstruction using NeRFs. Our findings demonstrate the potential of NeRFs for detailed and realistic 3D plant reconstruction and suggest practical approaches for accelerating the NeRF-based 3D reconstruction process.
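The abstract reports reconstruction quality as an F1 score against a LiDAR point cloud. A minimal sketch of how such a point-cloud F1 at a distance threshold is commonly computed (the function name, the threshold `tau`, and the brute-force nearest-neighbour search are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def point_cloud_f1(pred, gt, tau=0.01):
    """F1 score between two point clouds at distance threshold tau.

    pred: (N, 3) reconstructed points; gt: (M, 3) ground-truth points.
    Brute-force nearest-neighbour search; fine for small clouds.
    """
    # Pairwise distances between predicted and ground-truth points, shape (N, M).
    d = np.linalg.norm(pred[:, None, :] - gt[None, :, :], axis=-1)
    precision = np.mean(d.min(axis=1) <= tau)  # predicted points near the GT
    recall = np.mean(d.min(axis=0) <= tau)     # GT points covered by the prediction
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Identical clouds match perfectly at any positive threshold.
pts = np.random.rand(100, 3)
print(point_cloud_f1(pts, pts))  # → 1.0
```

Precision rewards predicted points that lie near the ground truth, recall rewards ground-truth points covered by the prediction, and F1 is their harmonic mean.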
Pages: 17
Related Papers (50 total)
  • [1] 3D Reconstruction and Rendering Based on Improved Neural Radiance Field
    Wan, Xiaona
    Xu, Ziyun
    Kang, Jian
    Feng, Xiaoyi
    2024 3RD INTERNATIONAL CONFERENCE ON IMAGE PROCESSING AND MEDIA COMPUTING, ICIPMC 2024, 2024, : 120 - 126
  • [2] Transient Neural Radiance Fields for Lidar View Synthesis and 3D Reconstruction
    Malik, Anagh
    Mirdehghan, Parsa
    Nousias, Sotiris
    Kutulakos, Kiriakos N.
    Lindell, David B.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [3] Bayesian uncertainty analysis for underwater 3D reconstruction with neural radiance fields
    Lian, Haojie
    Li, Xinhao
    Qu, Yilin
    Du, Jing
    Meng, Zhuxuan
    Liu, Jie
    Chen, Leilei
    APPLIED MATHEMATICAL MODELLING, 2025, 138
  • [4] WTBNeRF: Wind Turbine Blade 3D Reconstruction by Neural Radiance Fields
    Yang, Han
    Tang, Linchuan
    Ma, Hui
    Deng, Rongfeng
    Wang, Kai
    Zhang, Hui
    PROCEEDINGS OF TEPEN 2022, 2023, 129 : 675 - 687
  • [5] Monocular thermal SLAM with neural radiance fields for 3D scene reconstruction
    Wu, Yuzhen
    Wang, Lingxue
    Zhang, Lian
    Chen, Mingkun
    Zhao, Wenqu
    Zheng, Dezhi
    Cai, Yi
    NEUROCOMPUTING, 2025, 617
  • [6] High-fidelity wheat plant reconstruction using 3D Gaussian splatting and neural radiance fields
    Stuart, Lewis A. G.
    Wells, Darren M.
    Atkinson, Jonathan A.
    Castle-Green, Simon
    Walker, Jack
    Pound, Michael P.
    GIGASCIENCE, 2025, 14
  • [7] Adapting Neural Radiance Fields (NeRF) to the 3D Scene Reconstruction Problem Under Dynamic Illumination Conditions
    Savin, V.
    Kolodiazhna, O.
    CYBERNETICS AND SYSTEMS ANALYSIS, 2023, 59 (06) : 910 - 918
  • [9] Estimating 3D Uncertainty Field: Quantifying Uncertainty for Neural Radiance Fields
    Shen, Jianxiong
    Ren, Ruijie
    Ruiz, Adria
    Moreno-Noguer, Francesc
    2024 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, ICRA 2024, 2024, : 2375 - 2381
  • [10] Fast 3D Reconstruction of UAV Images Based on Neural Radiance Field
    Jiang, Cancheng
    Shao, Hua
    APPLIED SCIENCES-BASEL, 2023, 13 (18)