Real-Time Drone Surveillance System for Violent Crowd Behavior Unmanned Aircraft System (UAS) - Human Autonomy Teaming (HAT)

Cited by: 5
Authors
Simpson, Todd [1]
Affiliations
[1] Wright State Univ, Southwest Res Inst, Sinclair Coll, Dayton, OH 45435 USA
Source
2021 IEEE/AIAA 40TH DIGITAL AVIONICS SYSTEMS CONFERENCE (DASC) | 2021
Keywords
Unmanned Aerial Systems; UAS; UAV; Drone; Artificial Intelligence; Machine Learning; Human Autonomy Teaming; Crowd Behavior Prediction; Real-time Monitoring; Loihi; True North; Neuromorphic Computing; Crowd Control; Wright State University; Sinclair College; Human Factors;
DOI
10.1109/DASC52595.2021.9594332
Chinese Library Classification (CLC) code
V [Aviation, Aerospace];
Discipline classification code
08; 0825;
Abstract
Unmanned Aerial Systems (UASs), or drones, continue to increase in capability and sophistication across a wide range of applications. UASs are highly mobile, easily deployed, and capable of real-time monitoring of crowd behavior through multi-sensor detection and remote sensing of objects. These capabilities make UASs a very useful tool for Human Autonomy Teaming (HAT) applications, such as Law Enforcement (LE), that capitalize on Human Factors (HF). This study examines the concept of combining drone technology with Artificial Intelligence (AI) and Machine Learning (ML) methods to produce a UAS that can assist LE in monitoring and assessing crowd behavior during peaceful and non-peaceful events. LE agencies are increasingly tasked with engaging in the dynamic environments that exist at public events. Used as a force multiplier and autonomous tool, LE would benefit from an AI-UAS platform that assists in distinguishing the behavior of peaceful participants from that of malevolent participants or instigators who may attempt to take control. AI-UASs of this type would allow LE to leverage existing resources within their organizational structures and provide increased situation awareness through Live Virtual Constructive (LVC) broadcast and monitoring of these dynamic environments. Such AI-UAS systems would deliver real-time information to field forces as well as to command and control operations that may be remotely located. AI-UAS sensors can be dynamically allocated as needed for monitoring and documenting crowd behavior and police actions. Video recordings would provide evidence in court as well as counter truth-bending recordings published by professional protestors and agenda-driven mainstream media outlets. The benefits and impact of this type of LE AI-UAS platform would be profound. Traditional visible-light sensors can be strongly influenced by environmental factors, limiting their ability to detect variations associated with abnormal crowd behavior. To overcome this challenge, this project proposes to utilize four types of collection methods: a Multitask Cascading CNN (MC-CNN), a ScatterNet Hybrid Deep Learning Network, multiscale infrared optical flow (MIR-OF), and event cameras, including event-based vision and event-camera SLAM (Simultaneous Localization and Mapping). AI methods will be developed to monitor crowd density, average ground speed, human pose estimates, and movement behaviors, and to identify primary violent instigators. The proposed system will detect violent individuals in real time by leveraging both onboard image processing and cloud processing. Fundamental research for this project is inspired by and builds upon recent Drone Surveillance System (DSS) publications from IEEE and MDPI.
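As a rough illustration of the kind of crowd-motion metric the abstract describes (crowd density and average ground speed derived from optical flow), the following is a minimal sketch, not the paper's method: it assumes standard OpenCV dense Farneback optical flow on visible-band grayscale frames rather than the proposed multiscale infrared optical flow (MIR-OF), and the pixels_per_meter and dt calibration values are hypothetical placeholders that would come from the UAS camera and altitude.

import cv2
import numpy as np

def crowd_motion_metrics(prev_gray, curr_gray, pixels_per_meter=20.0, dt=1 / 30.0):
    """Estimate coarse crowd-motion statistics from two consecutive grayscale frames.

    pixels_per_meter and dt are hypothetical calibration values (ground sampling
    scale and frame interval), not parameters from the paper.
    """
    # Dense Farneback optical flow: per-pixel displacement in pixels per frame.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, _ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])

    # Convert pixel displacement per frame to an approximate ground speed in m/s.
    speed = (mag / pixels_per_meter) / dt

    return {
        "mean_speed_mps": float(np.mean(speed)),
        "p95_speed_mps": float(np.percentile(speed, 95)),
        # Fraction of pixels moving faster than 2 m/s: a crude agitation cue.
        "fast_fraction": float(np.mean(speed > 2.0)),
    }

In a full system these frame-level statistics would be only one input; the MC-CNN, ScatterNet, and pose-estimation stages named in the abstract would supply the per-person cues needed to single out violent instigators.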
Pages: 9