Multi-modal Interaction System for Enhanced User Experience

Cited by: 0
Authors
Jeong, Yong Mu [2 ]
Min, Soo Young [2 ]
Lee, Seung Eun [1 ]
Affiliations
[1] Seoul Natl Univ Sci & Technol, Dept Elect Engn, Seoul, South Korea
[2] Korea Elect Technol Inst, SW Device Res Ctr, Songnam, Gyeonggi Do, South Korea
Source
COMPUTER APPLICATIONS FOR WEB, HUMAN COMPUTER INTERACTION, SIGNAL AND IMAGE PROCESSING AND PATTERN RECOGNITION | 2012, Vol. 342
Keywords
Gesture Recognition; Realistic Interaction; User Interface;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
In this paper, we propose a gesture recognition platform for a realistic game. Our platform provides a more effective experience than conventional devices such as a keyboard and a mouse. It consists of one haptic glove (mGlove), four advanced sensing devices (ASDs), a gaze tracking module, and a communication module (ZigBee). The communication module establishes the communication channels among the game server, the ASDs, and the mGlove. We demonstrated that a game player can gain full control of an avatar in the realistic game by using body gestures, enhancing the user experience on the game platform.
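The abstract does not give implementation details, so the following is only a minimal illustrative sketch (not taken from the paper) of how a coordinator such as the ZigBee communication module could relay gesture events from the mGlove, the ASDs, and the gaze tracker to the game server. All names here (NodeId, GestureEvent, CommunicationModule, route) are hypothetical.

# Hypothetical sketch: a toy event router standing in for the paper's
# communication module. Node names and the message layout are assumptions.
from dataclasses import dataclass
from enum import Enum
from typing import Callable, Dict, List

class NodeId(Enum):
    GAME_SERVER = 0
    MGLOVE = 1        # haptic glove
    ASD_1 = 2         # advanced sensing devices (four in the platform)
    ASD_2 = 3
    ASD_3 = 4
    ASD_4 = 5
    GAZE_TRACKER = 6

@dataclass
class GestureEvent:
    source: NodeId          # node that produced the event
    gesture_code: int       # e.g. 0 = idle, 1 = grab, 2 = swipe (illustrative codes)
    payload: bytes = b""    # raw sensor data, if any

class CommunicationModule:
    """Registers handlers per destination node and relays gesture events."""

    def __init__(self) -> None:
        self._handlers: Dict[NodeId, List[Callable[[GestureEvent], None]]] = {}

    def subscribe(self, dest: NodeId, handler: Callable[[GestureEvent], None]) -> None:
        self._handlers.setdefault(dest, []).append(handler)

    def route(self, dest: NodeId, event: GestureEvent) -> None:
        # Deliver the event to every handler registered for the destination.
        for handler in self._handlers.get(dest, []):
            handler(event)

if __name__ == "__main__":
    comm = CommunicationModule()
    # The game server consumes gesture events to drive the avatar.
    comm.subscribe(
        NodeId.GAME_SERVER,
        lambda ev: print(f"server: gesture {ev.gesture_code} from {ev.source.name}"),
    )
    # An ASD node reports a recognized body gesture; the glove reports a grab.
    comm.route(NodeId.GAME_SERVER, GestureEvent(NodeId.ASD_1, gesture_code=2))
    comm.route(NodeId.GAME_SERVER, GestureEvent(NodeId.MGLOVE, gesture_code=1))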
Pages: 287+
Page count: 3