Automatic Generation of Urban Road 3D Models for Pedestrian Studies from LiDAR Data

Cited by: 14
Authors
Fernandez-Arango, David [1 ]
Varela-Garcia, Francisco-Alberto [1 ]
Gonzalez-Aguilera, Diego [2 ]
Laguela-Lopez, Susana [2 ]
Affiliations
[1] Univ A Coruna, Civil Engn Dept, Carto LAB, Campus Elvina, La Coruna 15071, Spain
[2] Univ Salamanca, Polytech Sch Avila, TIDOP Res Grp, Avila 05003, Spain
Keywords
LiDAR point cloud; mobile LiDAR system; point cloud segmentation; urban road; urban mobility; pedestrian accessibility; laser scanning data; mobile LiDAR; extraction; classification; markings; trees
DOI
10.3390/rs14051102
Chinese Library Classification (CLC)
X [Environmental science, safety science]
Discipline classification code
08; 0830
Abstract
Point clouds acquired with a mobile LiDAR system (MLS) have high density and accuracy, which makes it possible to identify the different elements of a road in them, as numerous studies have shown, especially over the last decade. This study presents a methodology to characterize the urban space available for walking by segmenting point clouds acquired with MLS and automatically generating impedance surfaces for use in pedestrian accessibility studies. Common problems in the automatic segmentation of LiDAR point clouds were corrected, achieving a very accurate segmentation of the points belonging to the ground. In addition, the proposed methodology resolves occlusions, caused mainly by parked vehicles, that leave no LiDAR points in spaces normally intended for pedestrian circulation, such as sidewalks. The innovation of the method therefore lies in the high definition of the generated 3D model of the pedestrian space for modeling pedestrian mobility, which allowed us to apply it to the search for shorter and safer pedestrian paths between students' homes and schools in urban areas within the Big-Geomove project. Both the developed algorithms and the LiDAR data used are freely licensed for use in further research.
Pages: 23
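As a rough illustration of the kind of processing the abstract describes, the sketch below chains three steps: a crude grid-based ground filter for an (x, y, z) point cloud, rasterization of the ground points into a slope-based impedance surface, and a least-cost (Dijkstra) search for a pedestrian path over that surface. This is not the authors' algorithm: the cell size, height tolerance, slope penalty, and the synthetic test cloud are illustrative assumptions, and only NumPy and the standard library are used.

```python
# Minimal sketch of a pipeline of the kind the abstract describes, NOT the
# paper's actual algorithms: (1) crude grid-based ground filtering of an
# (x, y, z) point cloud, (2) rasterization into a slope-based impedance
# surface, (3) a least-cost pedestrian path via Dijkstra on the grid.
# All thresholds and the synthetic cloud below are illustrative assumptions.
import heapq
import numpy as np


def segment_ground(points, cell=0.5, tol=0.15):
    """Label points as ground if they lie within `tol` metres of the
    lowest point of their (cell x cell) grid cell."""
    ij = np.floor(points[:, :2] / cell).astype(int)
    ij -= ij.min(axis=0)
    keys = ij[:, 0] * (ij[:, 1].max() + 1) + ij[:, 1]
    z_min = np.full(keys.max() + 1, np.inf)
    np.minimum.at(z_min, keys, points[:, 2])
    return points[:, 2] - z_min[keys] < tol


def impedance_raster(ground_pts, cell=0.5):
    """Rasterize ground points to a mean-height grid and derive a simple
    impedance: 1 + slope penalty; cells with no points are impassable."""
    ij = np.floor(ground_pts[:, :2] / cell).astype(int)
    ij -= ij.min(axis=0)
    h, w = ij.max(axis=0) + 1
    height = np.zeros((h, w))
    count = np.zeros((h, w))
    np.add.at(height, (ij[:, 0], ij[:, 1]), ground_pts[:, 2])
    np.add.at(count, (ij[:, 0], ij[:, 1]), 1)
    mean_z = np.where(count > 0, height / np.maximum(count, 1), np.nan)
    gy, gx = np.gradient(mean_z, cell)
    slope = np.hypot(gx, gy)
    cost = 1.0 + 10.0 * slope          # steeper cells are costlier to walk
    cost[np.isnan(mean_z)] = np.inf    # occluded/empty cells are blocked
    return cost


def least_cost_path(cost, start, goal):
    """Plain Dijkstra over 4-connected grid cells; returns a list of cells."""
    h, w = cost.shape
    dist = np.full((h, w), np.inf)
    prev = {}
    dist[start] = 0.0
    pq = [(0.0, start)]
    while pq:
        d, cur = heapq.heappop(pq)
        if cur == goal:
            break
        if d > dist[cur]:
            continue
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < h and 0 <= nc < w and np.isfinite(cost[nr, nc]):
                nd = d + cost[nr, nc]
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    prev[(nr, nc)] = cur
                    heapq.heappush(pq, (nd, (nr, nc)))
    path, node = [], goal
    while node in prev:
        path.append(node)
        node = prev[node]
    return [start] + path[::-1] if path else []


if __name__ == "__main__":
    # Synthetic "street": gently sloping ground plus a raised block standing
    # in for an obstacle such as a parked vehicle blocking the sidewalk.
    rng = np.random.default_rng(0)
    xy = rng.uniform(0, 20, size=(40000, 2))
    z = 0.02 * xy[:, 0] + rng.normal(0, 0.02, 40000)
    block = (xy[:, 0] > 8) & (xy[:, 0] < 12) & (xy[:, 1] > 5) & (xy[:, 1] < 15)
    z[block] += 1.5
    cloud = np.column_stack([xy, z])

    ground = segment_ground(cloud)
    cost = impedance_raster(cloud[ground])
    route = least_cost_path(cost, (1, 1), (cost.shape[0] - 2, cost.shape[1] - 2))
    print(f"{ground.sum()} ground points, path of {len(route)} cells")
```

The slope-based penalty makes the path avoid abrupt height changes (curbs, the obstacle's edges), which mimics, in a very simplified way, how an impedance surface can steer pedestrian routing around non-walkable space.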