Visualization and noise reduction algorithm based on event camera

Cited: 0
Authors
Yan C. [1 ]
Wang X. [1 ]
Zuo Y. [1 ]
Li L. [2 ]
Chen J. [2 ]
Affiliations
[1] School of Optics and Photonics, Beijing Institute of Technology, Beijing
[2] School of Automation, Beijing Institute of Technology, Beijing
Source
Beijing Hangkong Hangtian Daxue Xuebao / Journal of Beijing University of Aeronautics and Astronautics | 2021, Vol. 47, No. 2
Funding
National Natural Science Foundation of China
Keywords
Event camera; Noise reduction; Robustness; Visual navigation; Visualization;
DOI
10.13700/j.bh.1001-5965.2020.0192
Abstract
The asynchronous event stream produced by an event camera is difficult to observe and use directly, and it contains substantial noise. To address this, we introduce an improved visualization and noise reduction algorithm for event cameras. Because the event stream responds to object motion, the proposed algorithm extracts valid events by filtering noise using the temporal and spatial continuity of moving edges. To make the events easy to observe and apply, they are then accumulated under a double limitation on both the event count and a time threshold. In experiments on a real dataset, the noise reduction algorithm effectively suppresses background-activity noise while preserving detailed edge information when motion begins or is slow, increasing the number of detected corners. The visualization algorithm reduces the variance of the per-frame event count while maintaining the frame rate, improving the information uniformity of the resulting "event frames". The experimental results demonstrate the effectiveness of the proposed method for both noise reduction and visualization. © 2021, Editorial Board of JBUAA. All rights reserved.
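The abstract describes two mechanisms: rejecting noise events that lack spatiotemporal support from a moving edge, and grouping the surviving events into frames closed by whichever of two thresholds (event count or elapsed time) is reached first. The sketch below illustrates both ideas in a generic form; it is not the authors' exact algorithm. The function names (`ba_filter`, `accumulate_frames`), the sensor resolution, and the parameter values (`radius`, `dt`, `max_events`, `max_dt`) are illustrative assumptions, and this simple variant counts a pixel's own previous event as neighborhood support.

```python
import numpy as np

def ba_filter(events, width, height, radius=1, dt=5000):
    """Background-activity filter (illustrative): keep an event only if some
    event fired within `radius` pixels and the last `dt` microseconds."""
    # Timestamp map padded by `radius` so border pixels see a full window.
    last = np.full((height + 2 * radius, width + 2 * radius), -np.inf)
    kept = []
    for t, x, y, p in events:
        # Most recent timestamps in the (2r+1)x(2r+1) window around (x, y).
        patch = last[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
        if patch.max() >= t - dt:          # a nearby event fired recently
            kept.append((t, x, y, p))
        last[y + radius, x + radius] = t   # record this event's timestamp
    return kept

def accumulate_frames(events, max_events=2000, max_dt=33000):
    """Close a frame when EITHER the event count or the elapsed time reaches
    its threshold (the "double limitation" described in the abstract)."""
    frames, cur, t0 = [], [], None
    for ev in events:
        if t0 is None:
            t0 = ev[0]                     # timestamp of the frame's first event
        cur.append(ev)
        if len(cur) >= max_events or ev[0] - t0 >= max_dt:
            frames.append(cur)
            cur, t0 = [], None
    if cur:                                # flush the partial last frame
        frames.append(cur)
    return frames
```

Closing a frame on whichever limit trips first is what keeps the per-frame event count nearly constant during fast motion while still guaranteeing a minimum frame rate when the scene is static.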
Citation
Pages: 342-350
Page count: 8
References
18 total
[1] SANG Y S, LI R H, LI Y Q, et al., Research on neuromorphic vision sensor and its applications, Chinese Journal on Internet of Things, 3, 4, pp. 63-71, (2019)
[2] GALLEGO G, DELBRUCK T, ORCHARD G, et al., Event-based vision: A survey, IEEE Transactions on Pattern Analysis and Machine Intelligence, 99, pp. 1-1, (2020)
[3] WANG T T, CAI Z H, WANG Y X., Integrated vision/inertial navigation method of UAVs in indoor environment, Journal of Beijing University of Aeronautics and Astronautics, 44, 1, pp. 176-186, (2018)
[4] VIDAL A R, REBECQ H, HORSTSCHAEFER T, et al., Ultimate SLAM? Combining events, images, and IMU for robust visual SLAM in HDR and high-speed scenarios, IEEE Robotics and Automation Letters, 3, 2, pp. 994-1001, (2018)
[5] MUEGGLER E, GALLEGO G, REBECQ H, et al., Continuous-time visual-inertial odometry for event cameras, IEEE Transactions on Robotics, 34, 6, pp. 1425-1440, (2018)
[6] MA Y Y, YE Z H, LIU K H, et al., Event-based visual localization and mapping algorithms: A survey, Acta Automatica Sinica, 46, pp. 1-11, (2020)
[7] XIE X M, DU J, SHI G M, et al., An improved approach for visualizing dynamic vision sensor and its video denoising, Proceedings of the International Conference on Video and Image Processing, pp. 176-180, (2017)
[8] DELBRUCK T., Frame-free dynamic digital vision, Proceedings of the International Symposium on Secure-Life Electronics, Advanced Electronics for Quality Life and Society, pp. 21-26, (2008)
[9] FENG Y, LV H Y, LIU H L, et al., Event density based denoising method for dynamic vision sensor, Applied Sciences, 10, 6, (2020)
[10] HUANG J, GUO M H, CHEN S S., A dynamic vision sensor with direct logarithmic output and full-frame picture-on-demand, Proceedings of the 2017 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 1-4, (2017)