The Common Characteristics of User-Defined and Mid-Air Gestures for Rotating 3D Digital Contents

Cited by: 2
Authors
Chen, Li-Chieh [1 ]
Cheng, Yun-Maw [2 ]
Chu, Po-Ying [1 ]
Sandnes, Frode Eika [3 ]
Affiliations
[1] Tatung Univ, Dept Ind Design, Taipei, Taiwan
[2] Tatung Univ, Grad Inst Design Sci, Dept Comp Sci & Engn, Taipei, Taiwan
[3] Oslo & Akershus Univ Coll Appl Sci, Oslo, Norway
Source
UNIVERSAL ACCESS IN HUMAN-COMPUTER INTERACTION; INTERACTION TECHNIQUES AND ENVIRONMENTS, PT II | 2016, Vol. 9738
Keywords
Mid-air gesture; User-defined gesture; 3D digital content rotation; INTERFACE; SYSTEM
DOI
10.1007/978-3-319-40244-4_2
CLC Classification
TP3 [Computing technology; computer technology]
Discipline Code
0812
Abstract
Recently, mid-air gesture technology for manipulating 3D digital content has become an important research topic. To meet the needs of users and usage contexts, eliciting user-defined gestures is essential. However, previous work reported that user-defined hand gestures tend to vary significantly in posture, motion, and speed, making it difficult to identify common characteristics. In this research, the authors conducted an experiment to study intuitive hand gestures for controlling the rotation of 3D digital furniture. Twenty graduate students majoring in Industrial Design were invited to participate in the task. Although gestures varied considerably across participants, common characteristics were extracted through systematic behavior coding and analysis. The results indicated that the open palm and the D handshape (American Sign Language) were the most intuitive hand poses. In addition, moving the hands along the circumference of a horizontal circle was the most intuitive hand motion and trajectory.
Pages: 15 - 22
Number of pages: 8
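
The abstract reports that moving the hands along the circumference of a horizontal circle was the most intuitive motion for rotating the 3D furniture models. As an illustrative sketch only (not the authors' implementation), the Python snippet below shows one plausible way such a circular hand trajectory could be mapped to a yaw increment for a 3D model; the function name, the (x, z) horizontal-plane coordinate convention, and the assumption that rotation follows the hand's angular displacement around the model's vertical axis are all assumptions made for this example.

import math

def yaw_from_hand_positions(prev_xz, curr_xz, center_xz):
    # Illustrative assumption: hand positions are projected onto the horizontal
    # (x, z) plane, and rotation follows the hand's angular displacement
    # around the model's vertical axis (centered at center_xz).
    a0 = math.atan2(prev_xz[1] - center_xz[1], prev_xz[0] - center_xz[0])
    a1 = math.atan2(curr_xz[1] - center_xz[1], curr_xz[0] - center_xz[0])
    delta = a1 - a0
    # Wrap to (-pi, pi] so the angle does not jump at the branch cut.
    if delta > math.pi:
        delta -= 2 * math.pi
    elif delta <= -math.pi:
        delta += 2 * math.pi
    return delta

# Example: a quarter-circle sweep around the model center yields about 90 degrees.
print(math.degrees(yaw_from_hand_positions((1.0, 0.0), (0.0, 1.0), (0.0, 0.0))))  # ~90.0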