Experimental methods to predict dynamic vision sensor event camera performance

Cited by: 10
Authors
McReynolds, Brian [1]
Graca, Rui [1]
Delbruck, Tobi [1]
Affiliations
[1] Institute of Neuroinformatics, Sensors Group, Zurich, Switzerland
Funding
Swiss National Science Foundation
Keywords
neuromorphic cameras; dynamic vision sensors; event cameras; sensor characterization; pixel; tracking
DOI
10.1117/1.OE.61.7.074103
CLC classification
O43 [Optics]
Subject classification
070207; 0803
Abstract
Dynamic vision sensors (DVS) represent a promising new technology, offering low power consumption, sparse output, high temporal resolution, and wide dynamic range. These features make DVS attractive for new research areas including scientific and space-based applications; however, more precise understanding of how sensor input maps to output under real-world constraints is needed. Often, metrics used to characterize DVS report baseline performance by measuring observable limits but fail to characterize the physical processes at the root of those limits. To address this limitation, we describe step-by-step procedures to measure three important performance parameters: (1) temporal contrast threshold, (2) cutoff frequency, and (3) refractory period. Each procedure draws inspiration from previous work, but links measurements sequentially to infer physical phenomena at the root of measured behavior. Results are reported over a range of brightness levels and user-defined biases. The threshold measurement technique is validated with test-pixel node voltages, and a first-order low-pass approximation of photoreceptor response is shown to predict event cutoff temporal frequency to within 9% accuracy. The proposed method generates lab-measured parameters compatible with the event camera simulator v2e, allowing more accurate generation of synthetic datasets for innovative applications. (c) 2022 Society of Photo-Optical Instrumentation Engineers (SPIE)
Pages: 19
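The abstract states that a first-order low-pass approximation of the photoreceptor response predicts event cutoff temporal frequency to within 9%. The following minimal sketch illustrates that idea under stated assumptions: a sinusoidal log-intensity stimulus, events fired when the filtered log-intensity swing exceeds the temporal contrast threshold, and hypothetical function names — it is not the paper's implementation.

```python
import math

def lowpass_gain(f, f3db):
    """Magnitude response |H(f)| of a first-order low-pass filter
    with 3-dB corner frequency f3db (both in Hz)."""
    return 1.0 / math.sqrt(1.0 + (f / f3db) ** 2)

def event_cutoff_frequency(f3db, stimulus_contrast, threshold):
    """Highest stimulus frequency at which the low-pass-attenuated
    log-intensity swing still reaches the event threshold.

    Solves lowpass_gain(f, f3db) * stimulus_contrast = threshold
    for f. Returns 0.0 if the stimulus never crosses threshold.
    """
    ratio = stimulus_contrast / threshold
    if ratio <= 1.0:
        return 0.0
    return f3db * math.sqrt(ratio ** 2 - 1.0)

# Example: a photoreceptor with a 100 Hz corner, a stimulus of
# temporal contrast 0.4, and an event threshold of 0.2 still
# produces events up to 100 * sqrt(3) ~ 173 Hz.
fc = event_cutoff_frequency(100.0, 0.4, 0.2)
```

Because the photoreceptor bandwidth scales with photocurrent, `f3db` would itself vary with scene brightness and bias settings, which is why the paper measures it over a range of brightness levels.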