Handheld tangible interface for enhanced depth information transfer of 3D virtual object

Cited by: 0
Authors
Han B.-K. [1 ]
Kim S.C. [2 ]
Kwon D.S. [3 ]
Choi T.-Y. [1 ]
Kim H.-S. [1 ]
Kyung J. [1 ]
Kim D.-H. [1 ]
Affiliations
[1] Department of Robotics and Mechatronics, Korea Institute of Machinery and Materials
[2] Department of ICT Convergence, Hallym University
[3] Department of Mechanical Engineering, KAIST
Keywords
Active contour model; Robotic interface; Shape-changing interface
DOI
10.5302/J.ICROS.2020.20.0162
Chinese Library Classification (CLC)
TP [Automation technology, computer technology]
Subject classification code
0812
Abstract
In this paper, we propose a novel three-dimensional interaction system based on a shape-changeable mobile interface. We use multiple serially linked line segments to physically collocate virtual objects in real space. Specifically, the proposed system conveys geometric information to users by physically enclosing the target virtual object with its outer shape. To this end, we further propose an algorithm, based on an active-contour model, that controls each joint of the system so that the corresponding links align with the virtual surface. An experiment was conducted to verify the proposed interaction scheme, in which geometric information was conveyed through mechanical shape changes of the interface. The experimental results indicate that the proposed method is effective for interacting with 3D virtual objects using only a mobile interface. © ICROS 2020.
Pages: 1047-1053
Page count: 6
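The abstract describes an algorithm that controls each joint of the linkage so that the links align with the virtual surface, based on an active-contour model. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' implementation: it simplifies the problem to 2D, assumes a circular cross-section as the virtual surface, a fixed link length, and illustrative energy weights, and fits the absolute link orientations by gradient descent on a snake-style energy (a surface-distance term plus a bending term). All names and parameters (N_LINKS, LINK_LEN, W_SURFACE, W_BEND, fit_chain, and so on) are assumptions made for this example.

```python
# Hypothetical sketch: aligning a chain of serially linked segments with a
# virtual surface using an active-contour-style energy. All constants and
# the circular "virtual surface" are illustrative assumptions.
import numpy as np

N_LINKS = 8          # number of serially linked line segments
LINK_LEN = 0.8       # fixed physical length of each link (assumed)
W_SURFACE = 1.0      # external energy weight: pull joints onto the surface
W_BEND = 0.1         # internal energy weight: penalize sharp joint bends

def signed_distance(p, center=np.array([0.0, 0.0]), radius=2.0):
    """Signed distance to an assumed circular cross-section of the virtual object."""
    return np.linalg.norm(p - center) - radius

def joint_positions(base, thetas):
    """Forward kinematics: absolute link orientations -> joint positions."""
    pts = [base]
    for th in thetas:
        pts.append(pts[-1] + LINK_LEN * np.array([np.cos(th), np.sin(th)]))
    return np.array(pts)

def energy(thetas, base):
    """Active-contour-style energy: surface-attachment term + bending term."""
    pts = joint_positions(base, thetas)
    e_surface = sum(signed_distance(p) ** 2 for p in pts)
    e_bend = sum((thetas[i + 1] - thetas[i]) ** 2 for i in range(len(thetas) - 1))
    return W_SURFACE * e_surface + W_BEND * e_bend

def fit_chain(base, thetas, lr=0.05, iters=500, eps=1e-4):
    """Gradient descent on a numerical (central-difference) gradient of the energy."""
    thetas = thetas.copy()
    for _ in range(iters):
        grad = np.zeros_like(thetas)
        for i in range(len(thetas)):
            d = np.zeros_like(thetas)
            d[i] = eps
            grad[i] = (energy(thetas + d, base) - energy(thetas - d, base)) / (2 * eps)
        thetas -= lr * grad
    return thetas

if __name__ == "__main__":
    base = np.array([2.0, 0.0])            # chain base placed on the circle
    init = np.full(N_LINKS, np.pi / 2)     # initial guess: links pointing "up"
    fitted = fit_chain(base, init)
    print(np.round(joint_positions(base, fitted), 2))
```

In this toy version the fitted joint positions settle near the circular surface while the bending term keeps neighboring links from folding sharply; a physical device would additionally convert the fitted orientations into joint commands and respect actuator limits, which this sketch omits.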