Encountered-type visual haptic display using flexible sheet

Cited: 7
Authors
Furukawa, Tsuyoshi [1 ]
Inoue, Kenji [1 ]
Takubo, Tomohito [1 ]
Arai, Tatsuo [1 ]
Institution
[1] Osaka Univ, Grad Sch Engn Sci, Dept Syst Innovat, 1-3 Machikaneyama, Osaka 5608531, Japan
Source
PROCEEDINGS OF THE 2007 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1-10 | 2007
Keywords
DOI
10.1109/ROBOT.2007.363832
Chinese Library Classification
TP [automation technology; computer technology];
Discipline code
0812 ;
Abstract
An encountered-type visual haptic display using a flexible sheet is proposed; it allows users to feel as if they are seeing and pushing virtual soft objects directly. Both edges of a translucent flexible sheet, such as rubber, are attached to two manipulators. The manipulators apply bias tension to the sheet by pulling it from both sides, thus varying the sheet compliance. A user can feel the compliance of a virtual soft object by pushing the sheet directly with a finger. The motion of the fingertip is measured by stereo cameras. The manipulators also change the pose of the sheet along the object's surface, following the fingertip motion, so the user can touch different points of the object and feel as if stroking it. The sheet is also used as a rear-projection screen. From the measured fingertip position, the deformation of the object is calculated by FEM. An LCD projector projects the CG image of the deformed object stereoscopically onto the sheet from behind. The user can see the 3D image through stereoscopic glasses and touch the image directly. A method of correcting the CG image distortion caused by the movement of the sheet and the depression of the pushed sheet is proposed. A virtual soft cylinder is rendered by a prototype display: the stroking of the cylinder and the correction of its 3D image are evaluated.
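The tension-to-compliance relationship described in the abstract can be illustrated with a minimal sketch (not from the paper): for an idealized taut sheet of span L pushed at its midpoint, the small-deflection restoring force is F ≈ 4·T·d/L, so the felt compliance C = d/F = L/(4T) decreases as the manipulators pull harder. The span and tension values below are illustrative assumptions only.

```python
def midpoint_compliance(span_m: float, tension_n: float) -> float:
    """Compliance (m/N) at the midpoint of a taut sheet,
    small-deflection string/membrane model: C = L / (4*T)."""
    return span_m / (4.0 * tension_n)

if __name__ == "__main__":
    span = 0.4  # 40 cm sheet span between the two manipulators (assumed)
    for tension in (5.0, 10.0, 20.0):  # bias tension in newtons (assumed)
        c = midpoint_compliance(span, tension)
        print(f"T = {tension:5.1f} N -> compliance = {c * 1000:.1f} mm/N")
    # Doubling the bias tension halves the compliance, which is the
    # mechanism the display uses to render softer or stiffer objects.
```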
Pages: 479-484
Page count: 6
Related papers
50 records in total
  • [1] Encountered-Type Visual Haptic Display Using MR Fluid
    Ohnari, Hiroki
    Abiko, Satoko
    Tsujita, Teppei
    HAPTIC INTERACTION: SCIENCE, ENGINEERING AND DESIGN, 2018, 432 : 151 - 155
  • [2] Visual Guidance for a Spatial Discrepancy Problem of Encountered-Type Haptic Display
    Lee, Chang-Gyu
    Dunn, Gregory Lynn
    Oakley, Ian
    Ryu, Jeha
    IEEE TRANSACTIONS ON SYSTEMS MAN CYBERNETICS-SYSTEMS, 2020, 50 (04): : 1384 - 1394
  • [3] A Non-grounded and Encountered-type Haptic Display Using a Drone
    Yamaguchi, Kotaro
    Kato, Ginga
    Kuroda, Yoshihiro
    Kiyokawa, Kiyoshi
    Takemura, Haruo
    SUI'16: PROCEEDINGS OF THE 2016 SYMPOSIUM ON SPATIAL USER INTERACTION, 2016, : 43 - 46
  • [4] Sensory Evaluation of Cutting Force for Encountered-type Haptic Display Using MR Fluid
    Ohnari, Hiroki
    Abiko, Satoko
    Tsujita, Teppei
    2018 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND BIOMIMETICS (ROBIO), 2018, : 2399 - 2404
  • [5] Synthesizing the Roughness of Textured Surfaces for an Encountered-Type Haptic Display Using Spatiotemporal Encoding
    Kim, Yaesol
    Kim, Siyeon
    Oh, Uran
    Kim, Young J.
    IEEE TRANSACTIONS ON HAPTICS, 2021, 14 (01) : 32 - 43
  • [6] Effects of Physical Hardness on the Perception of Rendered Stiffness in an Encountered-Type Haptic Display
    Zamani, Naghmeh
    Culbertson, Heather
    IEEE TRANSACTIONS ON HAPTICS, 2023, 16 (01) : 46 - 56
  • [7] Haptic Rendering of Curved Surface by Bending an Encountered-Type Flexible Plate
    Jeon, Seokhee
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2016, E99D (07): : 1862 - 1870
  • [8] Encountered-type haptic display for large VR environment using per-plane reachability maps
    Kim, Yaesol
    Kim, Hyun Jung
    Kim, Young J.
    COMPUTER ANIMATION AND VIRTUAL WORLDS, 2018, 29 (3-4)
  • [9] Development of Integrated Visual Haptic Display Using Translucent Flexible Sheet
    Inoue, Kenji
    Uesugi, Reiko
    Sasama, Ryouhei
    Arai, Tatsuo
    Mae, Yasushi
    JOURNAL OF ROBOTICS AND MECHATRONICS, 2005, 17 (03) : 302 - 309
  • [10] Visual Guidance for Encountered Type Haptic Display: A feasibility study
    Lee, Chang-Gyu
    Dunn, Gregory Lynn
    Oakley, Ian
    Ryu, Jeha
    ADJUNCT PROCEEDINGS OF THE 2016 IEEE INTERNATIONAL SYMPOSIUM ON MIXED AND AUGMENTED REALITY (ISMAR-ADJUNCT), 2016, : 74 - 77