Deep Learning for UAV Detection and Classification via Radio Frequency Signal Analysis

Cited by: 2
Authors
Podder, Prajoy [1 ]
Zawodniok, Maciej [1 ]
Madria, Sanjay [2 ]
Affiliations
[1] Missouri Univ Sci & Technol, Dept Elect & Comp Engn, Rolla, MO 65409 USA
[2] Missouri Univ Sci & Technol, Dept Comp Sci, Rolla, MO USA
Source
PROCEEDINGS OF THE 2024 25TH IEEE INTERNATIONAL CONFERENCE ON MOBILE DATA MANAGEMENT, MDM 2024 | 2024
Keywords
Radio Frequency (RF) signal; Deep Neural Network (DNN); Convolutional Neural Network (CNN); unmanned aerial vehicles (UAV); DRONE DETECTION; TECHNOLOGIES; SYSTEM;
DOI
10.1109/MDM61037.2024.00040
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology]
Subject Classification Code
0812
Abstract
Unmanned Aerial Vehicles (UAVs) are advertised as a great tool that benefits society and humanity. However, UAVs also pose significant security threats, ranging from privacy invasion to interference with commercial aircraft landing and takeoff, accidental crashes into vehicles or people, and military or terrorist attacks. Consequently, there is a pressing need to detect and identify UAVs to mitigate such risks. While image-based methods are crucial for UAV detection, radio frequency (RF) emissions offer additional valuable insights. Analyzing RF signals, such as those used in UAV-ground station communications, can reveal UAV types based on distinct frequency usage or communication patterns. This work introduces a deep-learning-based approach for recognizing and identifying UAVs from their RF emissions. Captured RF signals are transformed into spectrograms, which are subsequently analyzed with deep neural networks. Existing methods achieve low identification accuracy; for instance, the ResNet-50V2 model achieves an accuracy of 85.39% even in controlled, noise-free laboratory conditions. Moreover, in outdoor environments at distances of 50 m and 100 m, the accuracy drops to 68.90% and 56.88%, respectively. To improve classification accuracy outdoors, a CNN model was developed, yielding an accuracy of 78.12%. Leveraging the ResNet-50V2 architecture, a remarkable accuracy of 95.08% was attained in binary classification tasks on a dataset comprising 195 mixed UAV images and 290 non-mixed UAV images.
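The pipeline the abstract describes (captured RF signal → spectrogram → deep network) can be illustrated with a minimal sketch of the spectrogram step. This is not the authors' implementation: the sample rate, signal model, and STFT parameters below are illustrative assumptions, and a synthetic chirp stands in for a real UAV control-link capture.

```python
import numpy as np
from scipy.signal import spectrogram

# Illustrative parameters, not taken from the paper.
fs = 1_000_000                       # sample rate in Hz (assumed)
t = np.arange(0, 0.01, 1 / fs)       # 10 ms capture window

# Synthetic stand-in for a UAV control-link burst: a linear chirp plus noise.
x = np.cos(2 * np.pi * (50_000 + 5e6 * t) * t) + 0.1 * np.random.randn(t.size)

# Short-time Fourier analysis: 256-sample windows with 50% overlap.
f, frames, Sxx = spectrogram(x, fs=fs, nperseg=256, noverlap=128)

# Convert to a dB-scaled image, a common input representation for CNNs.
log_spec = 10 * np.log10(Sxx + 1e-12)
print(log_spec.shape)  # (frequency bins, time frames) -> one "image" per capture
```

Each capture thus becomes a 2-D frequency-by-time array that an image classifier such as ResNet-50V2 can consume directly; window length and overlap trade off frequency against time resolution.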
Pages: 165-174
Number of pages: 10