A Review on Deep Learning Techniques for 3D Sensed Data Classification

Cited by: 124
Authors
Griffiths, David [1 ]
Boehm, Jan [1 ]
Affiliations
[1] UCL, Department of Civil, Environmental and Geomatic Engineering, Gower Street, London WC1E 6BT, England
Keywords
point cloud; deep learning; classification; semantics; segmentation; machine learning; object recognition; contextual classification; LiDAR data; extraction; multi-view; features; images; trees
DOI
10.3390/rs11121499
Chinese Library Classification (CLC)
X [Environmental Science, Safety Science]
Discipline Classification Code
08; 0830
Abstract
Over the past decade, deep learning has driven progress in 2D image understanding. Despite these advances, techniques for automatically understanding 3D sensed data, such as point clouds, remain comparatively immature. However, with a range of important applications, from indoor robot navigation to national-scale remote sensing, there is high demand for algorithms that can learn to automatically understand and classify 3D sensed data. In this paper we review the current state-of-the-art deep learning architectures for processing unstructured Euclidean data. We begin by addressing the background concepts and traditional methodologies. We then review the main current approaches, including RGB-D, multi-view, volumetric and fully end-to-end architecture designs. Datasets for each category are documented and explained. Finally, we give a detailed discussion of the future of deep learning for 3D sensed data, using the literature to justify the areas where future research would be most valuable.
Pages: 29