Human Recognition and Tracking in Narrow Indoor Environment using 3D Lidar Sensor

Cited by: 0
Authors
Yoon, Jae-Seong [1 ]
Bae, Sang-Hyeon [1 ]
Kuc, Tae-yong [1 ]
Affiliations
[1] Sungkyunkwan Univ, Dept Elect & Comp Engn, Suwon 16419, South Korea
Source
2020 20th International Conference on Control, Automation and Systems (ICCAS) | 2020
Keywords
Human Recognition; Mobile Robot; Point Cloud; Support Vector Machine; Indoor Environment;
DOI
10.23919/iccas50221.2020.9268208
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
This paper presents a method for human recognition, tracking, and clustering in an indoor environment using a 3D lidar sensor and discusses two major issues in clustering. The first is that, under Euclidean distance-based clustering, a wall and a person standing near it are frequently merged into a single object. The second is noise caused by reflective materials such as glass or marble. To cluster objects and recognize humans in this environment, we propose a pre-processing module for clustering, composed of five steps that remove the walls around the robot and reduce point cloud noise. We embedded the whole process in the robot system, and it runs while the robot is in motion.
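The abstract does not spell out the five pre-processing steps, but the two failure modes it names (wall points merging with nearby people under Euclidean clustering, and speckle noise from glass or marble) map onto standard point-cloud operations. The sketch below is a minimal, hypothetical illustration using Open3D, not the authors' implementation: it strips dominant planar surfaces with RANSAC, filters statistical outliers, and then applies density-based Euclidean clustering. The input file name, the plane-count limit, and all thresholds are placeholder assumptions.

```python
# Hypothetical sketch of a wall-removal + denoising + clustering pipeline
# analogous to the paper's pre-processing idea; NOT the authors' code.
# Assumes Open3D is installed and "scan.pcd" is a placeholder lidar scan.
import numpy as np
import open3d as o3d

pcd = o3d.io.read_point_cloud("scan.pcd")  # placeholder input file

# 1) Strip dominant planes (walls, floor) so they cannot merge with people.
for _ in range(3):  # assume at most 3 large planar surfaces near the robot
    plane_model, inliers = pcd.segment_plane(distance_threshold=0.05,
                                             ransac_n=3,
                                             num_iterations=1000)
    if len(inliers) < 500:  # remaining plane too small to be a wall; stop
        break
    pcd = pcd.select_by_index(inliers, invert=True)

# 2) Suppress sparse speckle noise from reflective surfaces (glass, marble).
pcd, _ = pcd.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)

# 3) Euclidean (density-based) clustering on the cleaned cloud.
labels = np.array(pcd.cluster_dbscan(eps=0.3, min_points=10))
print(f"{labels.max() + 1} candidate object clusters")
```

Consistent with the paper's keywords and its LIBSVM citation, the resulting clusters would then be passed to an SVM classifier to separate human from non-human objects.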
Pages: 978-981
Page count: 4
References
8 references in total
[1] Chang C.-C., Lin C.-J., 2011, ACM Transactions on Intelligent Systems and Technology, DOI 10.1145/1961189.1961199
[2] Bradski G., 2008, Learning OpenCV
[3] Häselich M., 2014, Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), p. 4118, DOI 10.1109/IROS.2014.6943142
[4] Kidono K., 2011, Proceedings of the IEEE Intelligent Vehicles Symposium, p. 405, DOI 10.1109/IVS.2011.5940433
[5] Lin T.-C., 2018, Proceedings of the IEEE International Conference on Image Processing (ICIP), p. 1922, DOI 10.1109/ICIP.2018.8451578
[6] Navarro-Serment L. E., Mertz C., Hebert M., "Pedestrian Detection and Tracking Using Three-dimensional LADAR Data," International Journal of Robotics Research, 2010, 29(12): 1516-1528
[7] Rusu R. B., 2011, Proceedings of the 2011 IEEE International Conference on Robotics and Automation (ICRA), p. 1
[8] Wang H., Wang B., Liu B., Meng X., Yang G., "Pedestrian recognition and tracking using 3D LiDAR for autonomous vehicle," Robotics and Autonomous Systems, 2017, 88: 71-78