Automatic Segmentation of Tree Structure From Point Cloud Data

Times cited: 18
Authors
Digumarti, Sundara Tejaswi [1 ,2 ]
Nieto, Juan [1 ]
Cadena, Cesar [1 ]
Siegwart, Roland [1 ]
Beardsley, Paul [2 ]
Affiliations
[1] Swiss Fed Inst Technol, Autonomous Syst Lab, CH-8092 Zurich, Switzerland
[2] Disney Res, CH-8006 Zurich, Switzerland
Keywords
Robotics in agriculture and forestry; object detection; segmentation and categorization; RGB-D perception; RECONSTRUCTION; AIRBORNE; MODELS;
DOI
10.1109/LRA.2018.2849499
Chinese Library Classification: TP24 [Robotics]
Discipline codes: 080202; 1405
Abstract
Methods for capturing and modeling vegetation, such as trees or plants, typically distinguish between two components: branch skeleton and foliage. Current methods do not provide the quantitatively accurate tree structure and foliage density needed for applications such as visualization, inspection, or estimation of vegetation parameters. This letter describes an automatic method for segmenting three-dimensional point cloud data of vegetation, acquired from commodity scanners, into its two main components, branches and leaves, using geometric features computed directly on the point cloud. The specific type of vegetation considered is broadleaf trees. We present a data-driven approach in which a random forest classifier performs the segmentation. In contrast to state-of-the-art methods, the point cloud is not reduced to a set of primitives such as cylinders. Instead, the algorithm works at the level of the input point cloud itself, preserving quantitative accuracy in the resulting model. Computation of typical vegetation metrics follows naturally from this model. We achieve an average classification accuracy of 91% on simulated data across three different species of broadleaf trees. Qualitative results on real data are also presented.
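The abstract describes a per-point pipeline: geometric features are computed directly on the raw point cloud and a random forest classifier labels each point as branch or leaf. The sketch below illustrates that general idea in Python; the eigenvalue-based features (linearity, planarity, scattering), the neighborhood size k, and the training setup are illustrative assumptions and not the authors' exact feature design.

# Minimal, hypothetical sketch: per-point branch/leaf classification using
# local geometric features and a random forest. Feature choices are assumptions.
import numpy as np
from scipy.spatial import cKDTree
from sklearn.ensemble import RandomForestClassifier

def local_geometric_features(points, k=20):
    """Eigenvalue-based features (linearity, planarity, scattering) per point."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)          # k nearest neighbors per point
    feats = np.zeros((len(points), 3))
    for i, nbrs in enumerate(idx):
        cov = np.cov(points[nbrs].T)          # 3x3 covariance of the neighborhood
        evals = np.sort(np.linalg.eigvalsh(cov))[::-1]
        l1, l2, l3 = np.maximum(evals, 1e-12) # clamp to avoid division by zero
        feats[i] = [(l1 - l2) / l1,           # linearity: high for branch-like points
                    (l2 - l3) / l1,           # planarity: high for leaf surfaces
                    l3 / l1]                  # scattering: high for volumetric clutter
    return feats

# Hypothetical usage with placeholder data; labels: 0 = branch, 1 = leaf.
train_xyz = np.random.rand(1000, 3)
train_labels = np.random.randint(0, 2, 1000)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(local_geometric_features(train_xyz), train_labels)

test_xyz = np.random.rand(500, 3)
pred = clf.predict(local_geometric_features(test_xyz))  # per-point branch/leaf labels

In practice the classifier would be trained on labeled scans (e.g., simulated trees with known branch/leaf assignments) and applied to real scans; working on the points themselves, rather than fitted primitives, is what preserves the quantitative accuracy claimed in the abstract.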
Pages: 3043-3050
Number of pages: 8