GAN-based image-to-friction generation for tactile simulation of fabric material

Cited by: 24
Authors
Cai, Shaoyu [1 ]
Zhao, Lu [2 ]
Ban, Yuki [3 ]
Narumi, Takuji [4 ]
Liu, Yue [2 ,5 ]
Zhu, Kening [1 ,6 ]
Affiliations
[1] City Univ Hong Kong, Sch Creat Media, Hong Kong, Peoples R China
[2] Beijing Inst Technol, Sch Opt & Photon, Beijing Engn Res Ctr Mixed Real & Adv Display, Beijing, Peoples R China
[3] Univ Tokyo, Grad Sch Frontier Sci, Chiba, Japan
[4] Univ Tokyo, Grad Sch Informat Sci & Technol, Tokyo, Japan
[5] AICFVE Beijing Film Acad, Beijing, Peoples R China
[6] City Univ Hong Kong, Shenzhen Res Inst, Shenzhen, Peoples R China
Source
COMPUTERS & GRAPHICS-UK | 2022, Vol. 102
Funding
National Natural Science Foundation of China;
Keywords
Supervised learning; Generative adversarial networks (GANs); Haptic rendering; Electrovibration surface; Tactile simulation; Fabrics; PERCEPTION; FEEDBACK; TOUCH;
DOI
10.1016/j.cag.2021.09.007
Chinese Library Classification (CLC)
TP31 [Computer Software];
Discipline classification code
081202; 0835;
Abstract
An electrovibration tactile display can render the tactile feeling of differently textured surfaces by modulating voltage to generate frictional force: when a user slides a finger across the display surface, he/she feels the frictional texture. However, preparing and fine-tuning appropriate frictional signals for haptic design and texture simulation is not trivial. In this paper, we present a deep-learning-based framework that generates frictional signals from textured images of fabric materials. The generated frictional signals can be used for tactile rendering on the electrovibration tactile display. Leveraging generative adversarial networks (GANs), our system generates displacement-based friction-coefficient data with which the tactile display simulates the tactile feedback of different fabric materials. Our experimental results show that the proposed generative model produces friction-coefficient signals that are visually and statistically close to the ground-truth signals. Follow-up user studies on fabric-texture simulation show that users could not discriminate between the generated and the ground-truth frictional signals rendered on the electrovibration tactile display, suggesting the effectiveness of our deep friction-signal-generation model. (C) 2021 Elsevier Ltd. All rights reserved.
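The record contains no implementation details beyond the abstract, so the following is only a minimal, hedged sketch of what an image-to-friction conditional GAN of this kind could look like, not the authors' architecture: a generator that encodes a fabric-texture image (plus a noise vector) into a displacement-based friction-coefficient sequence, and a discriminator that scores image/signal pairs. All layer sizes, the signal length SIG_LEN, the image resolution, and the training hyperparameters below are assumptions made purely for illustration.

# Illustrative sketch only (assumed architecture, not the paper's code):
# a conditional GAN mapping a fabric-texture image to a 1-D friction-coefficient signal.
import torch
import torch.nn as nn

SIG_LEN = 256          # assumed number of displacement-based friction samples
IMG_CHANNELS = 1       # assumed grayscale 64x64 texture image

class Generator(nn.Module):
    """Encodes a texture image and decodes a friction-coefficient sequence."""
    def __init__(self, noise_dim=64):
        super().__init__()
        self.noise_dim = noise_dim
        self.encoder = nn.Sequential(                        # 1x64x64 -> 128x4x4 features
            nn.Conv2d(IMG_CHANNELS, 32, 4, 2, 1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, 2, 1), nn.ReLU(),
            nn.Conv2d(64, 128, 4, 2, 1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
        )
        self.decoder = nn.Sequential(
            nn.Linear(128 * 4 * 4 + noise_dim, 512), nn.ReLU(),
            nn.Linear(512, SIG_LEN), nn.Sigmoid(),           # friction coefficients scaled to (0, 1)
        )

    def forward(self, img, z):
        feat = self.encoder(img)
        return self.decoder(torch.cat([feat, z], dim=1))

class Discriminator(nn.Module):
    """Scores whether a friction signal is real or generated for a given image."""
    def __init__(self):
        super().__init__()
        self.img_enc = nn.Sequential(
            nn.Conv2d(IMG_CHANNELS, 32, 4, 2, 1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 4, 2, 1), nn.LeakyReLU(0.2),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
        )
        self.head = nn.Sequential(
            nn.Linear(64 * 4 * 4 + SIG_LEN, 256), nn.LeakyReLU(0.2),
            nn.Linear(256, 1),                               # real/fake logit
        )

    def forward(self, img, signal):
        return self.head(torch.cat([self.img_enc(img), signal], dim=1))

# One adversarial training step on dummy data (standard binary cross-entropy GAN loss).
if __name__ == "__main__":
    G, D = Generator(), Discriminator()
    opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
    opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
    bce = nn.BCEWithLogitsLoss()

    imgs = torch.rand(8, IMG_CHANNELS, 64, 64)               # fabric texture images (dummy)
    real_sig = torch.rand(8, SIG_LEN)                        # measured friction coefficients (dummy)
    z = torch.randn(8, G.noise_dim)

    # Discriminator update: real image/signal pairs -> 1, generated pairs -> 0.
    fake_sig = G(imgs, z).detach()
    loss_d = bce(D(imgs, real_sig), torch.ones(8, 1)) + \
             bce(D(imgs, fake_sig), torch.zeros(8, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator update: try to make the discriminator label generated signals as real.
    loss_g = bce(D(imgs, G(imgs, z)), torch.ones(8, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

Under these assumptions, rendering would then sample the generated friction-coefficient sequence along the finger's displacement and map it to the electrovibration drive voltage; that mapping is device-specific and is not specified in this record.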
Pages: 460-473
Number of pages: 14
Related papers
69 records in total
[1] Ardila A, Uribe BE, Angel ME. Handedness and psychophysics: weight and roughness. International Journal of Neuroscience, 1987, 36(1-2): 17-21.
[2] Arjovsky M. Towards principled methods for training generative adversarial networks. 2017.
[3] Asano S, Okamoto S, Matsuura Y, Yamada Y. Toward quality texture display: vibrotactile stimuli to modify material roughness sensations. Advanced Robotics, 2014, 28(16): 1079-1089.
[4] Aytar Y. arXiv preprint, 2016.
[5] Bau O. Proceedings of UIST '10, 2010: 283. DOI: 10.1145/1866029.1866074.
[6] Benko H, Holz C, Sinclair M, Ofek E. NormalTouch and TextureTouch: high-fidelity 3D haptic shape rendering on handheld virtual reality controllers. UIST 2016: Proceedings of the 29th Annual Symposium on User Interface Software and Technology, 2016: 717-728.
[7] Bochereau S, Sinclair S, Hayward V. Perceptual constancy in the reproduction of virtual tactile textures with surface displays. ACM Transactions on Applied Perception, 2018, 15(2).
[8] Buda M, Maki A, Mazurowski MA. A systematic study of the class imbalance problem in convolutional neural networks. Neural Networks, 2018, 106: 249-259.
[9] Bueno M-A, Lemaire-Semail B, Amberg M, Giraud F. A simulation from a tactile device to render the touch of textile fabrics: a preliminary study on velvet. Textile Research Journal, 2014, 84(13): 1428-1440.
[10] Cai S. ICAT-EGVE, 2020: 11. DOI: 10.2312/egve.20201254.