Development of a customizable interactions questionnaire (CIQ) for evaluating interactions with objects in augmented/virtual reality

Cited: 6
Authors
Gao, Meiyuzi [1 ]
Boehm-Davis, Deborah A. [1 ]
Affiliations
[1] Meta, Redmond, WA 98052 USA
Keywords
Usability testing; Questionnaire; Metrics; User studies;
DOI
10.1007/s10055-022-00678-8
Chinese Library Classification
TP39 [Computer Applications]
Discipline Codes
081203; 0835
Abstract
As new methods for interacting with systems are being developed for use within augmented or virtual reality, their impact on the quality of the user's experience needs to be assessed. Although many instruments exist for evaluating the overall user experience or the computer interface used to complete tasks, few provide measures that can be used to evaluate the specific forms of interaction typically used in these environments. This paper describes the development of a customizable questionnaire for measuring the subjective user experience that focuses on the quality of interactions with objects in augmented reality/virtual reality (AR/VR) worlds, which we call the Customizable Interactions Questionnaire, or CIQ. The final questionnaire measures five factors related to user satisfaction while using the system: quality of interactions, assessment of task performance, comfort, quality of sensory enhancements, and consistency with expectations.
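The abstract reports the five factors but not a scoring procedure. Purely as an illustration, a per-factor score could be computed as the mean of the Likert ratings for that factor's items; the item names and factor-to-item groupings in the sketch below are hypothetical placeholders, not the published CIQ items.

```python
# Illustrative sketch only: this is not the authors' published scoring method.
# Assumes Likert-type items grouped under the five factors reported in the
# abstract; the item identifiers (q1..q11) are hypothetical placeholders.
from statistics import mean

# Hypothetical mapping of questionnaire items to the five CIQ factors.
FACTOR_ITEMS = {
    "quality_of_interactions": ["q1", "q2", "q3"],
    "task_performance": ["q4", "q5"],
    "comfort": ["q6", "q7"],
    "sensory_enhancements": ["q8", "q9"],
    "consistency_with_expectations": ["q10", "q11"],
}

def score_ciq(responses: dict[str, int]) -> dict[str, float]:
    """Average the Likert ratings of each factor's items for one participant."""
    return {
        factor: mean(responses[item] for item in items)
        for factor, items in FACTOR_ITEMS.items()
    }

# Example: one participant's ratings on a 1-7 scale.
participant = {f"q{i}": r for i, r in enumerate([6, 5, 7, 4, 5, 6, 6, 3, 4, 5, 6], start=1)}
print(score_ciq(participant))
```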
Pages: 699-716
Page count: 18
Related Papers
50 records in total
  • [1] Development of a customizable interactions questionnaire (CIQ) for evaluating interactions with objects in augmented/virtual reality
    Meiyuzi Gao
    Deborah A. Boehm-Davis
    Virtual Reality, 2023, 27 : 699 - 716
  • [2] Interactions with 3D virtual objects in augmented reality using natural gestures
    Dash, Ajaya Kumar
    Balaji, Koniki Venkata
    Dogra, Debi Prosad
    Kim, Byung-Gyu
    VISUAL COMPUTER, 2024, 40 (09): 6449 - 6462
  • [3] Silhouettes from Real Objects Enable Realistic Interactions with a Virtual Human in Mobile Augmented Reality
    Kim, Hanseob
    Ali, Ghazanfar
    Pastor, Andreas
    Lee, Myungho
    Kim, Gerard J.
    Hwang, Jae-In
    APPLIED SCIENCES-BASEL, 2021, 11 (06):
  • [4] Multimodal Human Machine Interactions in Virtual and Augmented Reality
    Chollet, Gerard
    Esposito, Anna
    Gentes, Annie
    Horain, Patrick
    Karam, Walid
    Li, Zhenbo
    Pelachaud, Catherine
    Perrot, Patrick
    Petrovska-Delacretaz, Dijana
    Zhou, Dianle
    Zouari, Leila
    MULTIMODAL SIGNAL: COGNITIVE AND ALGORITHMIC ISSUES, 2009, 5398 : 1 - +
  • [5] Material Recognition for Immersive Interactions in Virtual/Augmented Reality
    Heng, Yuwen
    Dasmahapatra, Srinandan
    Kim, Hansung
    2023 IEEE CONFERENCE ON VIRTUAL REALITY AND 3D USER INTERFACES ABSTRACTS AND WORKSHOPS, VRW, 2023, : 577 - 578
  • [6] Hidden Surface Removal for Interactions between User's Bare Hands and Virtual Objects in Augmented Reality
    Ishizu, Takahiro
    Sakamoto, Makoto
    Sakoma, Kenji
    Shinoda, Takahiro
    Takei, Amane
    Ito, Takao
    PROCEEDINGS OF THE 2020 INTERNATIONAL CONFERENCE ON ARTIFICIAL LIFE AND ROBOTICS (ICAROB2020), 2020, : 728 - 731
  • [7] Evaluating User Preferences for Augmented Reality Interactions with the Internet of Things
    Chopra, Shreya
    Maurer, Frank
    PROCEEDINGS OF THE WORKING CONFERENCE ON ADVANCED VISUAL INTERFACES AVI 2020, 2020,
  • [8] Effect of science students interactions with educational resources in augmented reality virtual for spatial visualization development
    Herpich, Fabricio
    da Silva, Patricia Fernandez
    Rockenbach Tarouco, Liane Margarida
    REVISTA LATINOAMERICANA DE TECNOLOGIA EDUCATIVA-RELATEC, 2021, 20 (02): : 29 - 47
  • [9] Mediating Human-Robot Interactions with Virtual, Augmented, and Mixed Reality
    Szafir, Daniel
    VIRTUAL, AUGMENTED AND MIXED REALITY: APPLICATIONS AND CASE STUDIES, VAMR 2019, PT II, 2019, 11575 : 124 - 149
  • [10] A Pilot Study Comparing User Interactions Between Augmented and Virtual Reality
    Williams, Adam S.
    Zhou, Xiaoyan
    Batmaz, Anil Ufuk
    Pahud, Michel
    Ortega, Francisco
    ADVANCES IN VISUAL COMPUTING, ISVC 2023, PT II, 2023, 14362 : 3 - 14