IndexPen: Two-Finger Text Input with Millimeter-Wave Radar

Cited by: 12
Authors
Wei, Haowen [1 ]
Li, Ziheng [2 ]
Galvan, Alexander D. [1 ]
Su, Zhuoran [1 ]
Zhang, Xiao [1 ]
Pahlavan, Kaveh [1 ]
Solovey, Erin T. [1 ]
Affiliations
[1] Worcester Polytech Inst, 100 Inst Rd, Worcester, MA 01609 USA
[2] Columbia Univ, New York, NY 10027 USA
Source
PROCEEDINGS OF THE ACM ON INTERACTIVE MOBILE WEARABLE AND UBIQUITOUS TECHNOLOGIES-IMWUT | 2022, Vol. 6, No. 2
Keywords
Millimeter wave FMCW radar; Micro-gesture sensing; In-air gestures; Deep Learning; Text input; Cursor interaction; GESTURE RECOGNITION;
DOI
10.1145/3534601
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
In this paper, we introduce IndexPen, a novel interaction technique for text input through two-finger in-air micro-gestures, enabling touch-free, effortless, tracking-based interaction designed to mirror real-world writing. Our system is based on millimeter-wave radar sensing and does not require instrumentation on the user. IndexPen can successfully identify 30 distinct gestures, representing the letters A-Z as well as Space, Backspace, Enter, and a special Activation gesture to prevent unintentional input. Additionally, we include a noise class to differentiate gestures from non-gesture noise. We present our system design, including the radio frequency (RF) processing pipeline, classification model, and real-time detection algorithms. We further demonstrate our proof-of-concept system with data collected over ten days with five participants, yielding 95.89% cross-validation accuracy on 31 classes (including noise). Moreover, we explore the learnability and adaptability of our system for real-world text input with 16 participants, all first-time users of IndexPen, over five sessions. After each session, the model pre-trained on the previous five-user study is calibrated, via transfer learning, on the new user's data collected so far. With calibration, the F1 score increased by an average of 9.14% per session, reaching an average of 88.3% in the last session across the 16 users. We also show that users can type sentences with IndexPen at 86.2% accuracy, measured by string similarity. This work builds a foundation and vision for future interaction interfaces that could be enabled with this paradigm.
Pages: 39
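The abstract reports sentence-entry accuracy of 86.2% measured by string similarity. As an illustration only (the record does not specify the paper's exact similarity definition), the sketch below computes one common string-similarity measure, Python's difflib.SequenceMatcher ratio, between an intended sentence and a hypothetical typed result.

```python
from difflib import SequenceMatcher

def string_similarity(target: str, typed: str) -> float:
    """Similarity ratio in [0, 1] between the intended sentence and the
    sentence actually entered; 1.0 means the strings match exactly."""
    return SequenceMatcher(None, target, typed).ratio()

# Hypothetical example strings, not taken from the study.
target = "the quick brown fox jumps over the lazy dog"
typed = "the quick brwn fox jumps over the lazy dog"  # one dropped letter
print(f"String similarity: {string_similarity(target, typed):.1%}")
```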