Single-Camera-Based 3D Drone Trajectory Reconstruction for Surveillance Systems

Cited by: 0
Authors
Hwang, Seo-Bin [1 ]
Cho, Yeong-Jun [1 ]
Affiliations
[1] Chonnam Natl Univ, Dept Artificial Intelligence Convergence, Gwangju 61186, South Korea
Source
IEEE ACCESS | 2025, Vol. 13
Funding
National Research Foundation of Singapore;
Keywords
Drones; Three-dimensional displays; Trajectory; Cameras; Image reconstruction; Surveillance; Sensors; Radar tracking; Accuracy; Estimation; Drone; 3D trajectory reconstruction; tracking; single camera; surveillance system; TRACKING;
DOI
10.1109/ACCESS.2025.3555321
CLC number
TP [Automation Technology, Computer Technology];
Subject classification code
0812;
Abstract
Drones have been utilized in various fields, but the number of drones used for illegal and hazardous purposes has recently increased. These misuses have driven the development of anti-drone systems (counter-unmanned aircraft systems, C-UAS), which comprise two key steps: detection and engagement. The detection step plays a crucial role in reconstructing the drone's trajectory for the subsequent engagement step. We therefore focus on a trajectory reconstruction approach that uses external data, such as CCTV images, instead of internal drone data. While numerous methods have been explored for estimating 3D drone trajectories using multiple sensors, they are often unsuitable for surveillance systems. In this study, we use a single calibrated camera, well suited to surveillance systems, and leverage the relationship between 2D and 3D space. A drone tracker automatically tracks the drone in the 2D images, and the tracked regions are used to estimate the drone's 2D rotation in the image through principal component analysis (PCA). By combining the estimated 2D drone positions with the drone's actual length, we geometrically infer the 3D drone trajectories. We also develop synthetic 2D and 3D drone datasets to address the lack of public drone datasets, and we generate a real-world 3D dataset and make it publicly available. The experimental results demonstrate that the proposed method accurately reconstructs drone trajectories in 3D space, with an MAE of 5.66 and an RMSE of 7.93. These low errors validate the practical value and potential of our framework as a single-camera-based surveillance system.
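The two geometric steps summarized in the abstract (PCA-based estimation of the drone's apparent 2D orientation from the tracked region, and depth recovery from the drone's known physical length under a calibrated camera) can be illustrated with a minimal Python sketch. This is not the authors' implementation: it assumes a pinhole camera model, a known intrinsic matrix K, a roughly fronto-parallel drone, and that the apparent length in pixels is measured along the PCA principal axis. Function and variable names (estimate_2d_orientation, back_project_center, true_length_m) are hypothetical.

    import numpy as np

    def estimate_2d_orientation(drone_pixels: np.ndarray) -> float:
        """Estimate the drone's apparent in-image orientation (radians) via PCA.

        drone_pixels: (N, 2) array of (x, y) coordinates of pixels belonging to
        the tracked drone region (e.g. from a segmentation mask or tight box).
        """
        # Covariance of the pixel coordinates; np.cov expects variables in rows.
        cov = np.cov(drone_pixels.T)
        eigvals, eigvecs = np.linalg.eigh(cov)
        principal_axis = eigvecs[:, np.argmax(eigvals)]  # largest-variance direction
        return float(np.arctan2(principal_axis[1], principal_axis[0]))

    def back_project_center(u: float, v: float,
                            apparent_length_px: float,
                            true_length_m: float,
                            K: np.ndarray) -> np.ndarray:
        """Recover a camera-frame 3D position from one calibrated view.

        Pinhole assumption: a roughly fronto-parallel drone of physical length
        true_length_m projecting to apparent_length_px pixels along its
        principal axis lies at depth Z ~ f * true_length_m / apparent_length_px.
        """
        fx, fy = K[0, 0], K[1, 1]
        cx, cy = K[0, 2], K[1, 2]
        Z = fx * true_length_m / apparent_length_px   # depth from known size
        X = (u - cx) * Z / fx                         # back-project the 2D center
        Y = (v - cy) * Z / fy
        return np.array([X, Y, Z])

    # Toy usage for one tracked frame, with assumed intrinsics and a 0.35 m drone.
    K = np.array([[1200.0, 0.0, 960.0],
                  [0.0, 1200.0, 540.0],
                  [0.0, 0.0, 1.0]])
    pixels = np.random.default_rng(0).normal([900.0, 500.0], [12.0, 4.0], (200, 2))
    theta = estimate_2d_orientation(pixels)
    point_3d = back_project_center(900.0, 500.0, apparent_length_px=48.0,
                                   true_length_m=0.35, K=K)

With only a single view, metric scale must come from the drone's known physical length; measuring the apparent length along the PCA axis is one way to compensate for the drone's in-image rotation before applying the size-to-depth relation.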
Pages: 56413-56427
Page count: 15