Seeing through Events: Real-Time Moving Object Sonification for Visually Impaired People Using Event-Based Camera

Times Cited: 10
Authors
Ji, Zihao [1 ]
Hu, Weijian [1 ]
Wang, Ze [1 ]
Yang, Kailun [2 ]
Wang, Kaiwei [1 ]
Affiliations
[1] Zhejiang Univ, Natl Engn Res Ctr Opt Instrumentat, Hangzhou 310058, Peoples R China
[2] Karlsruhe Inst Technol, Inst Anthropomat & Robot, D-76131 Karlsruhe, Germany
Keywords
event-based camera; computer vision for visually impaired people; sonification; unsupervised object tracking; TRACKING; SYSTEM;
DOI
10.3390/s21103558
Chinese Library Classification (CLC)
O65 [Analytical Chemistry]
Discipline Code
070302; 081704
Abstract
Scene sonification is a powerful technique to help Visually Impaired People (VIP) understand their surroundings. Existing methods usually sonify either the entire image of the surrounding scene acquired by a standard camera, or static obstacles detected a priori by image processing algorithms applied to the RGB image of the scene. However, if all the information in the scene is delivered to VIP simultaneously, it causes information redundancy. In fact, biological vision is more sensitive to moving objects in a scene than to static ones, which is also the design rationale of the event-based camera. In this paper, we propose a real-time sonification framework to help VIP understand the moving objects in the scene. First, we capture the events in the scene using an event-based camera and cluster them into multiple moving objects without relying on any prior knowledge. Then, MIDI-based sonification is performed on these objects synchronously. Finally, we conduct comprehensive experiments in which 20 VIP and 20 Sighted People (SP) listen to scene videos with the sonification audio. The results show that our method allows both groups of participants to clearly distinguish the number, size, motion speed, and motion trajectories of multiple objects, and that it is more comfortable to listen to than existing methods in terms of aesthetics.
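To make the pipeline described in the abstract concrete, the sketch below shows one plausible way to cluster raw events into moving objects without prior knowledge and to sonify each object over MIDI. It is not the authors' implementation: the choice of DBSCAN, the cluster_events/sonify_objects helpers, the 346-pixel sensor width, the time-scaling weight, and the pitch/velocity mappings are all illustrative assumptions.

# Minimal sketch (assumed design, not the paper's code) of the event-clustering
# and MIDI sonification pipeline described in the abstract.
import numpy as np
from sklearn.cluster import DBSCAN   # unsupervised clustering, no prior object model
import mido                          # MIDI output; requires a synthesizer backend

def cluster_events(events, eps=8.0, min_samples=20):
    # events: (N, 3) float array of (x, y, t_ms) from the event camera driver.
    # Time is scaled so spatial and temporal distances are comparable (assumed weighting).
    xyt = events * np.array([1.0, 1.0, 0.05])
    return DBSCAN(eps=eps, min_samples=min_samples).fit_predict(xyt)

def sonify_objects(events, labels, port, img_width=346):
    # One MIDI note per detected object: pitch from horizontal position, velocity from size.
    for label in set(labels) - {-1}:                         # -1 marks DBSCAN noise
        obj = events[labels == label]
        cx = obj[:, 0].mean()                                # horizontal centroid in pixels
        pitch = int(48 + 24 * cx / img_width)                # map left-to-right onto two octaves
        velocity = int(np.clip(obj.shape[0] / 50, 20, 127))  # larger clusters sound louder
        port.send(mido.Message('note_on', note=pitch, velocity=velocity))

# Usage on one short time window of events:
# with mido.open_output() as port:
#     labels = cluster_events(window)
#     sonify_objects(window, labels, port)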
Pages: 18
Related Papers (50 records total)
  • [1] Hu, Weijian; Wang, Kaiwei; Yang, Kailun; Cheng, Ruiqi; Ye, Yaozu; Sun, Lei; Xu, Zhijie. A Comparative Study in Real-Time Scene Sonification for Visually Impaired People. SENSORS, 2020, 20(11).
  • [2] Ghosh, Rohan; Mishra, Abhishek; Orchard, Garrick; Thakor, Nitish V. Real-Time Object Recognition and Orientation Estimation Using an Event-Based Camera and CNN. 2014 IEEE BIOMEDICAL CIRCUITS AND SYSTEMS CONFERENCE (BIOCAS), 2014: 544-547.
  • [3] Surougi, Hadeel; Zhao, Cong; McCann, Julie A. ARAware: Assisting Visually Impaired People with Real-Time Critical Moving Object Identification. SENSORS, 2024, 24(13).
  • [4] Zereen, Aniqua Nusrat; Corraya, Sonia. Detecting Real Time Object Along with the Moving Direction for Visually Impaired People. 2016 2ND INTERNATIONAL CONFERENCE ON ELECTRICAL, COMPUTER & TELECOMMUNICATION ENGINEERING (ICECTE), 2016.
  • [5] Baier, Gerold; Hermann, Thomas; Stephani, Ulrich. Event-based sonification of EEG rhythms in real time. CLINICAL NEUROPHYSIOLOGY, 2007, 118(06): 1377-1386.
  • [6] Tosun, Selman; Karaarslan, Enis. Real-Time Object Detection Application for Visually Impaired People: Third Eye. 2018 INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND DATA PROCESSING (IDAP), 2018.
  • [7] Zhao, Chunhui; Li, Yakun; Lyu, Yang. Event-based Real-time Moving Object Detection Based On IMU Ego-motion Compensation. 2023 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2023: 690-696.
  • [8] Golibrzuch, Kai; Schwabe, Sven; Zhong, Tianli; Papendorf, Kim; Wodtke, Alec M. Application of an Event-Based Camera for Real-Time Velocity Resolved Kinetics. JOURNAL OF PHYSICAL CHEMISTRY A, 2022, 126(13): 2142-2148.
  • [9] Zhang, Rumin; Wang, Wenyi; Zeng, Liaoyuan; Chen, Jianwen. A Real-Time Obstacle Detection Algorithm for the Visually Impaired Using Binocular Camera. COMMUNICATIONS, SIGNAL PROCESSING, AND SYSTEMS, 2019, 463: 1412-1419.
  • [10] Schmalfuss, H. Real-Time Seeing Through Moving Fog. OPTICS COMMUNICATIONS, 1976, 17(03): 245-246.