Device-Free Gesture Tracking Using Acoustic Signals

Cited by: 298
Authors
Wang, Wei [1 ]
Liu, Alex X. [1 ,2 ]
Sun, Ke [1 ]
Affiliations
[1] Nanjing Univ, State Key Lab Novel Software Technol, Nanjing, Jiangsu, Peoples R China
[2] Michigan State Univ, Dept Comp Sci & Engn, E Lansing, MI 48824 USA
Source
MOBICOM'16: PROCEEDINGS OF THE 22ND ANNUAL INTERNATIONAL CONFERENCE ON MOBILE COMPUTING AND NETWORKING | 2016
Funding
National Natural Science Foundation of China; US National Science Foundation;
Keywords
Gesture Tracking; Ultrasound; Device-free;
DOI
10.1145/2973750.2973764
Chinese Library Classification
TP3 [Computing Technology, Computer Technology];
Discipline Classification Code
0812
Abstract
Device-free gesture tracking is an enabling HCI mechanism for small wearable devices because fingers are too big to control the GUI elements on such small screens, and it is also an important HCI mechanism for medium-to-large size mobile devices because it allows users to provide input without blocking screen view. In this paper, we propose LLAP, a device-free gesture tracking scheme that can be deployed on existing mobile devices as software, without any hardware modification. We use speakers and microphones that already exist on most mobile devices to perform device-free tracking of a hand/finger. The key idea is to use acoustic phase to get fine-grained movement direction and movement distance measurements. LLAP first extracts the sound signal reflected by the moving hand/finger after removing the background sound signals that are relatively consistent over time. LLAP then measures the phase changes of the sound signals caused by hand/finger movements and then converts the phase changes into the distance of the movement. We implemented and evaluated LLAP using commercial-off-the-shelf mobile phones. For 1-D hand movement and 2-D drawing in the air, LLAP has a tracking accuracy of 3.5 mm and 4.6 mm, respectively. Using gesture traces tracked by LLAP, we can recognize the characters and short words drawn in the air with an accuracy of 92.3% and 91.2%, respectively.
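The abstract's core idea is that the phase of the reflected tone advances by one full cycle whenever the round-trip path length changes by one wavelength, so displacement can be recovered from unwrapped phase. The following is a minimal illustrative sketch of that phase-to-distance conversion, not the paper's implementation; the carrier frequency, sign convention, and signal model are assumptions for illustration.

```python
import numpy as np

# Illustrative constants (assumed, not from the paper).
SPEED_OF_SOUND = 343.0       # m/s at room temperature
CARRIER_FREQ = 20_000.0      # Hz, near-ultrasound tone playable by phone speakers
WAVELENGTH = SPEED_OF_SOUND / CARRIER_FREQ  # ~17.15 mm

def phase_to_distance(phase_samples):
    """Convert the measured phase (radians) of the reflected tone into
    hand displacement in metres.

    For a reflection, the round-trip path changes by twice the hand's
    displacement, so a 2*pi phase cycle corresponds to movement of half
    a wavelength; hence the factor 4*pi below.
    """
    unwrapped = np.unwrap(np.asarray(phase_samples, dtype=float))
    delta_phi = unwrapped - unwrapped[0]
    # Sign convention (assumed): moving toward the device shortens the
    # path and advances the phase, giving a positive distance here.
    return -delta_phi * WAVELENGTH / (4.0 * np.pi)

# A phase shift of -pi corresponds to a movement of lambda/4 (~4.3 mm)
# toward the device under this convention.
phases = np.linspace(0.0, -np.pi, 50)
dist = phase_to_distance(phases)
print(round(dist[-1] * 1000, 2))  # final displacement in mm
```

This only covers the final conversion step; the paper's pipeline first removes the quasi-static background reflections and extracts the phase of the dynamic component before applying a relation of this kind.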
Pages: 82 - 94
Page count: 13
Related Papers (50 total)
  • [1] Device-Free Gesture Tracking Using Acoustic Signals
    Wang, Wei; Liu, Alex X.; Sun, Ke
    MOBICOM'16: PROCEEDINGS OF THE 22ND ANNUAL INTERNATIONAL CONFERENCE ON MOBILE COMPUTING AND NETWORKING, 2016: 497 - 498
  • [2] Device-Free Gesture Recognition Using Time Series RFID Signals
    Ding, Han; Guo, Lei; Zhao, Cui; Li, Xiao; Shi, Wei; Zhao, Jizhong
    BROADBAND COMMUNICATIONS, NETWORKS, AND SYSTEMS, 2019, 303: 144 - 155
  • [3] DSW: One-Shot Learning Scheme for Device-Free Acoustic Gesture Signals
    Wang, Xun; Sun, Ke; Zhao, Ting; Wang, Wei; Gu, Qing
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2023, 22 (09): 5198 - 5215
  • [4] Multi-Person Device-Free Gesture Recognition Using mmWave Signals
    Wang, Jie; Ran, Zhouhua; Gao, Qinghua; Ma, Xiaorui; Pan, Miao; Xue, Kaiping
    CHINA COMMUNICATIONS, 2021, 18 (02): 186 - 199
  • [5] Practical Device-Free Gesture Recognition Using WiFi Signals Based on Metalearning
    Ma, Xiaorui; Zhao, Yunong; Zhang, Liang; Gao, Qinghua; Pan, Miao; Wang, Jie
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2020, 16 (01): 228 - 237
  • [6] Trajectory Features-Based Robust Device-Free Gesture Recognition Using mmWave Signals
    Wu, Jingmiao; Wang, Jie; Dai, Tong; Gao, Qinghua; Pan, Miao
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (10): 18123 - 18135
  • [7] Toward Device-Free Micro-Gesture Tracking via Accurate Acoustic Doppler-Shift Detection
    Liu, Wenyuan; Shen, Weihang; Li, Binbin; Wang, Lin
    IEEE ACCESS, 2019, 7: 1084 - 1094
  • [8] EchoTrack: Acoustic Device-free Hand Tracking on Smart Phones
    Chen, Huijie; Li, Fan; Wang, Yu
    IEEE INFOCOM 2017 - IEEE CONFERENCE ON COMPUTER COMMUNICATIONS, 2017
  • [9] Device-Free Human Detection Using WiFi Signals
    Li, Chu-Chen; Fang, Shih-Hau
    2016 IEEE 5TH GLOBAL CONFERENCE ON CONSUMER ELECTRONICS, 2016