ELiSeD - An Event-Based Line Segment Detector

Cited by: 27
Authors
Brandli, Christian [1]
Strubel, Jonas [1 ]
Keller, Susanne [1 ]
Scaramuzza, Davide [2 ]
Delbruck, Tobi [1 ]
Affiliations
[1] Univ Zurich, Inst Neuroinformat, Winterthurerstr 190, Zurich, Switzerland
[2] Univ Zurich, Robot & Percept Grp, Andreasstr 15m, Zurich, Switzerland
Source
2016 2ND INTERNATIONAL CONFERENCE ON EVENT-BASED CONTROL, COMMUNICATION, AND SIGNAL PROCESSING (EBCCSP) | 2016
Keywords
event-based; computer vision; machine vision; line segment detector; visual feature; DVS; DAVIS; silicon retina
DOI
10.1109/EBCCSP.2016.7605244
Chinese Library Classification (CLC)
TP [automation technology; computer technology]
Subject Classification Code
0812
Abstract
Event-based temporal contrast vision sensors such as the Dynamic Vision Sensor (DVS) offer high dynamic range, low latency, and low power consumption. Instead of frames, these sensors produce a stream of events that encode discrete amounts of temporal contrast. Surfaces and objects with sufficient spatial contrast trigger events when they move relative to the sensor, so the sensor inherently performs edge detection. These sensors are well suited for motion capture, but suitable event-based, low-level features that allow events to be assigned to spatial structures have so far been lacking. A general solution to the so-called event correspondence problem, i.e. inferring which events are caused by the motion of the same spatial feature, would allow these sensors to be applied to a multitude of tasks such as visual odometry or structure from motion. The proposed Event-based Line Segment Detector (ELiSeD) is a step towards solving this problem: it parameterizes the event stream as a set of line segments. The event stream used to update these low-level features is continuous in time and has high temporal resolution, which allows even fast motions to be captured without having to solve the conventional frame-to-frame motion correspondence problem. The ELiSeD feature detector and tracker runs in real time on a laptop computer at image speeds of up to 1300 pix/s and can continuously track rotations of up to 720 deg/s. The algorithm is open-sourced in the jAER project.
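To make the event representation in the abstract concrete, the following minimal Java sketch (Java, since the released code lives in the Java-based jAER project) shows one plausible way a per-pixel "time surface" of event timestamps yields a local edge orientation. It is an illustration under assumptions, not the ELiSeD implementation: the Event record, the lastTs array, and the onEvent method are invented for this sketch.

import static java.lang.Math.*;

/**
 * Hypothetical sketch, not the jAER ELiSeD code: a minimal DVS event record
 * and a per-pixel "time surface" of latest event timestamps, from which a
 * local edge orientation can be estimated.
 */
public class EventSketch {
    static final int W = 128, H = 128;            // DVS128 pixel array size

    /** One address-event: pixel address, microsecond timestamp, ON/OFF polarity. */
    record Event(int x, int y, int tUs, boolean on) {}

    /** Time surface: timestamp of the most recent event at each pixel. */
    static final int[][] lastTs = new int[H][W];

    /**
     * Store the event and return the angle of the local timestamp gradient.
     * For a moving edge this gradient is aligned with the edge normal, so the
     * edge orientation is perpendicular to the returned angle (mod pi).
     */
    static double onEvent(Event e) {
        lastTs[e.y()][e.x()] = e.tUs();
        int x = min(max(e.x(), 1), W - 2);        // clamp so the 3x3
        int y = min(max(e.y(), 1), H - 2);        // neighborhood stays in bounds
        double gx = lastTs[y][x + 1] - lastTs[y][x - 1];
        double gy = lastTs[y + 1][x] - lastTs[y - 1][x];
        // Pixels the edge has not reached yet still hold timestamp 0, so the
        // sign is unreliable at the wavefront; a real implementation would
        // mask uninitialized pixels and filter noise.
        return atan2(gy, gx);
    }

    public static void main(String[] args) {
        // A vertical edge sweeping in +x fires one column of events, then the next.
        for (int y = 60; y < 68; y++) onEvent(new Event(40, y, 1_000, true));
        double a = onEvent(new Event(41, 63, 2_000, true));
        System.out.printf("gradient angle %.2f rad -> vertical edge%n", a);
    }
}

A detector in the spirit of ELiSeD would go one step further and group events with similar local orientation into persistent line-segment features, updating each segment continuously as its supporting events arrive.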
Pages: 7