Corner location and recognition of single ArUco marker under occlusion based on YOLO algorithm

Cited by: 11
Authors
Li, Boxuan [1 ]
Wang, Benfei [1 ]
Tan, Xiaojun [1 ,2 ]
Wu, Jiezhang [1 ]
Wei, Liangliang [1 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Intelligent Syst Engn, Guangzhou, Peoples R China
[2] Southern Marine Sci & Engn Guangdong Lab Zhuhai, Zhuhai, Peoples R China
Keywords
fiducial marker; convolutional neural network; object detection; occlusion
DOI
10.1117/1.JEI.30.3.033012
CLC classification
TM [Electrical Engineering]; TN [Electronic and Communication Technology]
Subject classification codes
0808; 0809
Abstract
The ArUco marker is one of the most popular square fiducial markers used for precise location acquisition during autonomous unmanned aerial vehicle (UAV) landings. This paper presents a novel method to detect, recognize, and extract the location points of a single ArUco marker based on convolutional neural networks (CNNs). YOLOv3 and YOLOv4 networks are applied for end-to-end detection and recognition of ArUco markers under occlusion, and a custom lightweight network is employed to increase processing speed. The bounding box regression mechanism of the YOLO algorithm is modified to locate the four corners of each ArUco marker and to classify markers irrespective of orientation. The deep-learning method achieves a mean average precision exceeding 0.9 on the occlusion-free test set and over 0.4 under corner coverage, whereas the traditional algorithm fails under occlusion. The custom lightweight network notably speeds up prediction with an acceptable decline in performance. The proposed bounding box regression mechanism locates marker corners with less than 3% average distance error per corner without coverage and less than 8% under corner occlusion. (c) 2021 SPIE and IS&T [DOI: 10.1117/1.JEI.30.3.033012]
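The corner-accuracy figures quoted in the abstract (under 3% average distance error per corner without coverage, under 8% under occlusion) suggest a simple per-corner metric. A minimal sketch of such a metric, assuming the error is the Euclidean offset between predicted and ground-truth corners normalized by the marker side length (the exact normalization used in the paper is not stated in this record):

```python
import math

def corner_distance_error(pred, gt, side_len):
    """Mean Euclidean distance between predicted and ground-truth
    corners, expressed as a fraction of the marker side length.
    (Hypothetical normalization; the paper reports percentage errors.)"""
    dists = [math.dist(p, g) for p, g in zip(pred, gt)]
    return sum(dists) / (len(dists) * side_len)

# Illustrative values: four corners of a 100-px marker,
# predicted corners each off by a pixel or two.
gt = [(0, 0), (100, 0), (100, 100), (0, 100)]
pred = [(2, 1), (101, -1), (99, 102), (1, 99)]
print(corner_distance_error(pred, gt, 100))  # ~0.018, i.e. below the 3% threshold
```

A prediction passing the abstract's occlusion-free criterion would score below 0.03 under this metric; the occluded-corner criterion corresponds to 0.08.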
Pages: 19