An Approach to Free-Hand Interaction for 3D Scene Modeling

Cited by: 0
Authors
Wan H.-G. [1 ,2 ]
Li T. [2 ]
Feng L.-W. [2 ]
Chen Y.-S. [3 ]
Affiliations
[1] Center for Psychological Sciences, Zhejiang University, Hangzhou, 310027, Zhejiang
[2] State Key Lab of CAD&CG, Zhejiang University, Hangzhou, 310058, Zhejiang
[3] Department of Computer Science, The University of Hong Kong
Source
Beijing Ligong Daxue Xuebao/Transaction of Beijing Institute of Technology | 2019, Vol. 39, No. 02
Keywords
3D modeling; Gesture interaction; Gesture recognition; Gesture set; Interactive framework
DOI
10.15918/j.tbit1001-0645.2019.02.011
Abstract
To address free-hand interaction in 3D scene modeling, solutions were proposed at three levels for the key problems of scene modeling. At the macroscopic level, an interaction framework was established based on user activities and system tasks. At the middle level, a method was proposed for constructing a complete and practical gesture set, and a gesture set was then designed for the system using this method. At the microcosmic level, to solve the "Midas touch" problem, optimization methods were put forward in terms of time, space, and gesture assistance to improve the gesture recognition rate. Finally, experiments were carried out to verify the usability of the proposed methods, and a direction for future research is given. © 2019, Editorial Department of Transaction of Beijing Institute of Technology. All rights reserved.
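The "Midas touch" mitigation described in the abstract — constraining gesture activation by time and space so that incidental hand motion is not misread as a command — can be illustrated with a minimal sketch. Note this is not the paper's implementation: the class, thresholds, and pose labels below are hypothetical, showing only the general idea of a dwell-time constraint combined with a spatial interaction zone.

```python
# Illustrative sketch (not the paper's method): gating gesture activation
# on temporal and spatial constraints to mitigate the "Midas touch" problem.
# All names, thresholds, and pose labels are hypothetical.

from dataclasses import dataclass

@dataclass
class HandSample:
    t: float     # timestamp, seconds
    x: float     # hand position, normalized screen coordinates
    y: float
    pose: str    # recognized static hand pose, e.g. "point", "open"

class MidasTouchFilter:
    """Accept a gesture only if the same pose is held for a minimum
    dwell time inside a designated interaction zone."""

    def __init__(self, dwell=0.5, zone=((0.2, 0.2), (0.8, 0.8))):
        self.dwell = dwell      # temporal constraint (seconds)
        self.zone = zone        # spatial constraint (axis-aligned rectangle)
        self._pose = None       # pose currently being timed
        self._since = None      # timestamp when timing started

    def _in_zone(self, s):
        (x0, y0), (x1, y1) = self.zone
        return x0 <= s.x <= x1 and y0 <= s.y <= y1

    def update(self, s):
        """Feed one tracking sample; return the pose once it is confirmed."""
        if not self._in_zone(s) or s.pose != self._pose:
            # Reset the dwell timer on zone exit or pose change.
            self._pose, self._since = s.pose, s.t
            return None
        if s.t - self._since >= self.dwell:
            return s.pose
        return None

f = MidasTouchFilter()
f.update(HandSample(0.0, 0.5, 0.5, "point"))            # starts the timer
confirmed = f.update(HandSample(0.6, 0.5, 0.5, "point"))
print(confirmed)                                        # → point
```

The third constraint mentioned in the abstract, gesture assistance, would correspond to requiring an explicit auxiliary cue (for example a distinct activation pose) before commands are interpreted; it is omitted here for brevity.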
Pages: 175-180 (5 pages)