Interactions with 3D virtual objects in augmented reality using natural gestures

Cited by: 1
Authors
Dash, Ajaya Kumar [1 ,2 ]
Balaji, Koniki Venkata [2 ]
Dogra, Debi Prosad [2 ]
Kim, Byung-Gyu [3 ]
Affiliations
[1] IIIT Bhubaneswar, Dept Comp Sci, Bhubaneswar 751003, India
[2] IIT Bhubaneswar, Sch Elect Sci, Bhubaneswar 752050, India
[3] Sookmyung Womens Univ, Seoul, South Korea
Source
VISUAL COMPUTER | 2024, Vol. 40, Issue 9
Keywords
Augmented reality; Deep learning; Interaction with virtual objects; Hand pose estimation
DOI
10.1007/s00371-023-03175-4
Chinese Library Classification
TP31 [Computer software]
Discipline codes
081202; 0835
Abstract
Markers are the backbone of many cross-domain augmented reality (AR) applications available to the research community. However, relying on markers limits augmentation to prepared environments. As smart sensors are deployed across a wide spectrum of consumer electronics (CE) products, relying on natural gestures to render and interact with virtual content on such devices is becoming inevitable, and it opens up far more options for AR applications. This paper focuses on using the human palm as a natural target on which to render 3D virtual objects and interact with them in a typical AR set-up. While printed markers are comparatively easy to detect for camera pose estimation, detecting the palm as a replacement for a physical marker is challenging. To mitigate this, we use a two-stage palm detection model that tracks multiple palms and their key-points in real time. The detected key-points are used to estimate the camera pose before the 3D objects are rendered. Once the virtual objects are rendered, intuitive one-handed (uni-manual) natural gestures are used to interact with them, and a finite state machine (FSM) is proposed to detect changes in gesture during interaction. We validate the proposed interaction framework on several well-known 3D virtual objects that are commonly used to demonstrate scientific concepts to students of various grades. The framework performs better than state-of-the-art methods, achieving an average precision of 96.5% (vs. 82.9% for SSD+MobileNet) and 58.27 FPS (vs. 37.93 FPS for SSD+MobileNet). To widen the scope of the work, we also test neural network-based gesture recognition models on a versatile gesture dataset; this approach fits into the proposed AR pipeline at 46.83 FPS and runs in real time. These results indicate that the proposed method has good potential to mitigate some of the challenges faced by the research community in the interactive AR space.
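To make the pipeline described in the abstract concrete, the following is a minimal sketch, not the authors' implementation. It assumes palm key-points have already been detected by some two-stage detector, estimates the camera pose from those key-points with OpenCV's solvePnP, and steps a small finite state machine through gesture-driven interaction states. The 3D palm model coordinates, gesture labels, and state names below are illustrative assumptions; the paper's exact key-point set and FSM are not specified here.

# Minimal sketch (assumptions labelled below), not the authors' implementation.
import numpy as np
import cv2

# Hypothetical 3D palm model points (metres), one per tracked key-point,
# expressed in the palm's own coordinate frame.
PALM_MODEL_POINTS = np.array([
    [0.00, 0.00, 0.0],    # wrist
    [0.03, 0.08, 0.0],    # index-finger base
    [0.01, 0.09, 0.0],    # middle-finger base
    [-0.01, 0.085, 0.0],  # ring-finger base
    [-0.03, 0.07, 0.0],   # little-finger base
], dtype=np.float64)

def estimate_palm_pose(image_points, camera_matrix, dist_coeffs=None):
    """Recover the camera pose from 2D palm key-points using PnP.

    image_points: (N, 2) pixel coordinates in the same order as
    PALM_MODEL_POINTS. Returns (rotation_vector, translation_vector)
    or None if the solver fails.
    """
    if dist_coeffs is None:
        dist_coeffs = np.zeros(4)  # assume an undistorted camera
    ok, rvec, tvec = cv2.solvePnP(
        PALM_MODEL_POINTS,
        np.asarray(image_points, dtype=np.float64),
        camera_matrix,
        dist_coeffs,
        flags=cv2.SOLVEPNP_ITERATIVE,
    )
    return (rvec, tvec) if ok else None

# Illustrative FSM for one-handed interaction: (state, gesture) -> next state.
# The actual states and gestures used in the paper may differ.
FSM_TRANSITIONS = {
    ("IDLE", "open_palm"): "RENDER",   # palm detected -> render the 3D object
    ("RENDER", "pinch"): "GRAB",       # pinch -> grab/manipulate the object
    ("GRAB", "open_palm"): "RENDER",   # release -> back to the rendered state
    ("RENDER", "fist"): "IDLE",        # fist -> dismiss the object
}

def next_state(state, gesture):
    """Return the new interaction state; unrecognised gestures keep the state."""
    return FSM_TRANSITIONS.get((state, gesture), state)

In a real pipeline the rotation and translation returned by solvePnP would be converted into the rendering API's model-view matrix before the virtual object is drawn on the palm, and next_state would be called once per frame with the gesture label produced by the recognition model.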
Pages: 6449-6462
Page count: 14
Related Papers
50 records in total
  • [41] A Virtual Musical Instrument for 3D Performance with Short Gestures: Exploring Mapping Strategies with Virtual Reality
    Rodrigues, Andre Montes
    Zuffo, Marcelo Knorich
    Belloc, Olavo da Rosa
    Alves Faria, Regis Rossi
    BRIDGING PEOPLE AND SOUND, 2017, 10525 : 301 - 315
  • [42] Towards natural 3D interaction for laparoscopic augmented reality registration
    Joeres, Fabian
    Heinrich, Florian
    Schott, Danny
    Hansen, Christian
    COMPUTER METHODS IN BIOMECHANICS AND BIOMEDICAL ENGINEERING-IMAGING AND VISUALIZATION, 2021, 9 (04): : 384 - 391
  • [43] Poster: A Pilot Study on Stepwise 6-DoF Manipulation of Virtual 3D Objects using Smartphone in Wearable Augmented Reality Environment
    Ha, Taejin
    Woo, Woontack
    2013 IEEE SYMPOSIUM ON 3D USER INTERFACES (3DUI), 2013, : 137 - 138
  • [44] Augmented Reality for 3D construction
    Raajana, N. R.
    Suganya, S.
    Hemanand, R.
    Janani, S.
    Nandini, Sarada N. S.
    Ramanan, Sruthi V.
    INTERNATIONAL CONFERENCE ON MODELLING OPTIMIZATION AND COMPUTING, 2012, 38 : 66 - 72
  • [45] 3D Augmented Reality Marker Expands Workable Fields of Virtual Reality Action Games
    Usami, Makoto
    Miura, Kyohei
    Sugimura, Hiroshi
    Isshiki, Masao
    2015 IEEE 4TH GLOBAL CONFERENCE ON CONSUMER ELECTRONICS (GCCE), 2015, : 306 - 310
  • [46] User-elicited dual-hand interactions for manipulating 3D objects in virtual reality environments
    Nanjappan, Vijayakumar
    Liang, Hai-Ning
    Lu, Feiyu
    Papangelis, Konstantinos
    Yue, Yong
    Man, Ka Lok
    HUMAN-CENTRIC COMPUTING AND INFORMATION SCIENCES, 2018, 8
  • [47] Segmentation of underwater 3D acoustical images for augmented and virtual reality applications
    Giannitrapani, R
    Trucco, A
    Murino, V
    OCEANS '99 MTS/IEEE : RIDING THE CREST INTO THE 21ST CENTURY, VOLS 1-3, 1999, : 459 - 465
  • [48] 3D virtual models and augmented reality for radical prostatectomy: a narrative review
    Della Corte, Marcello
    Quara, Alberto
    De Cillis, Sabrina
    Volpi, Gabriele
    Amparore, Daniele
    Piramide, Federico
    Piana, Alberto
    Sica, Michele
    Di Dio, Michele
    Alba, Stefano
    Porpiglia, Francesco
    Checcucci, Enrico
    Fiori, Cristian
    CHINESE CLINICAL ONCOLOGY, 2024, 13 (04)
  • [49] Tangible authoring of 3D virtual scenes in dynamic augmented reality environment
    Lee, Jae Yeol
    Seo, Dong Woo
    Rhee, Gue Won
    COMPUTERS IN INDUSTRY, 2011, 62 (01) : 107 - 119
  • [50] 3D Virtual Reconstruction and Augmented Reality Visualization of Damaged Stone Sculptures
    Gherardini, Francesco
    Santachiara, Mattia
    Leali, Francesco
    FLORENCE HERI-TECH - THE FUTURE OF HERITAGE SCIENCE AND TECHNOLOGIES, 2018, 364