A Deep Learning Method Based on Triplet Network Using Self-Attention for Tactile Grasp Outcomes Prediction

Cited by: 6
Authors
Liu, Chengliang [1 ,2 ]
Yi, Zhengkun [1 ,2 ]
Huang, Binhua [1 ]
Zhou, Zhenning [1 ,2 ]
Fang, Senlin [1 ,3 ]
Li, Xiaoyu [1 ]
Zhang, Yupo [1 ]
Wu, Xinyu [1 ,2 ,4 ]
Affiliations
[1] Chinese Acad Sci, Shenzhen Inst Adv Technol, Guangdong Prov Key Lab Robot & Intelligent Syst, Shenzhen 518055, Peoples R China
[2] Univ Chinese Acad Sci, Sch Artificial Intelligence, Beijing 100049, Peoples R China
[3] City Univ Macau, Fac Data Sci, Macau 999078, Peoples R China
[4] Shenzhen Inst Artificial Intelligence & Robot Soc, SIAT Branch, Shenzhen 518055, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Contrastive learning; deep learning; grasping; self-attention; triplet network; slip
DOI
10.1109/TIM.2023.3285986
CLC Classification
TM [Electrical Technology]; TN [Electronics & Communication Technology]
Discipline Codes
0808; 0809
Abstract
Recent research has demonstrated that pregrasp tactile information can be used to effectively predict whether a grasp will succeed. However, most existing grasp prediction models do not perform satisfactorily when only a small dataset is available. In this article, we propose a deep network framework based on a triplet network with self-attention mechanisms for grasp outcome prediction. By forming the samples into contrasting triplets, our method generates more sample units and discovers potential connections between samples through the triplet loss. In addition, the inclusion of the self-attention mechanisms helps capture the internal correlation of features, further improving the performance of the network. We also validate that the self-attention module works better as a nonlinear projection head for contrastive learning than a multilayer perceptron module. Experimental results on a publicly available dataset show that the proposed framework is effective.
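The abstract's core training signal, a margin-based triplet loss that pulls an anchor toward a positive sample and pushes it from a negative one, can be sketched as follows. This is a minimal NumPy illustration of the standard triplet margin loss, not the paper's implementation; the embeddings and the margin value of 1.0 are assumptions for demonstration.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Standard triplet margin loss: encourage the anchor-positive
    distance to be smaller than the anchor-negative distance by
    at least `margin`; the loss is zero once that gap is achieved."""
    d_pos = np.linalg.norm(anchor - positive)  # anchor-positive distance
    d_neg = np.linalg.norm(anchor - negative)  # anchor-negative distance
    return max(d_pos - d_neg + margin, 0.0)

# Toy 2-D embeddings: the positive lies near the anchor, the negative far away,
# so the margin is already satisfied and the loss vanishes.
a = np.array([1.0, 0.0])
p = np.array([0.9, 0.1])
n = np.array([-1.0, 0.0])
print(triplet_loss(a, p, n))  # → 0.0
```

Swapping the positive and negative arguments yields a large positive loss, which is the gradient signal that reorganizes the embedding space during contrastive training.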
Pages: 14
Related Papers
50 records
  • [21] Crowd counting method based on the self-attention residual network
    Yan-Bo Liu
    Rui-Sheng Jia
    Qing-Ming Liu
    Xing-Li Zhang
    Hong-Mei Sun
    Applied Intelligence, 2021, 51 : 427 - 440
  • [22] Self-attention based GRU neural network for deep knowledge tracing
    Jin, Shangzhu
    Zhao, Yan
    Peng, Jun
    Chen, Ning
    Xue, Run
    Liang, Minghui
    Jiang, Yunfeng
    2022 IEEE 17TH CONFERENCE ON INDUSTRIAL ELECTRONICS AND APPLICATIONS (ICIEA), 2022, : 1436 - 1440
  • [23] A self-attention based deep learning method for lesion attribute detection from CT reports
    Peng, Yifan
    Yan, Ke
    Sandfort, Veit
    Summers, Ronald M.
    Lu, Zhiyong
    2019 IEEE INTERNATIONAL CONFERENCE ON HEALTHCARE INFORMATICS (ICHI), 2019, : 218 - 222
  • [24] Prediction of Material Properties of Inorganic Compounds Using Self-Attention Network
    Noda K.
    Takahashi H.
    Tsuda K.
    Hiroshima M.
    Transactions of the Japanese Society for Artificial Intelligence, 2023, 38 (02)
  • [25] Solar irradiance prediction based on self-attention recursive model network
    Kang, Ting
    Wang, Huaizhi
    Wu, Ting
    Peng, Jianchun
    Jiang, Hui
    Frontiers in Energy Research, 2022, 10
  • [26] A Deep Learning Method Based Self-Attention and Bi-directional LSTM in Emotion Classification
    Fei, Rong
    Zhu, Yuanbo
    Yao, Quanzhu
    Xu, Qingzheng
    Hu, Bo
    JOURNAL OF INTERNET TECHNOLOGY, 2020, 21 (05): : 1447 - 1461
  • [27] Full-field prediction of stress and fracture patterns in composites using deep learning and self-attention
    Chen, Yang
    Dodwell, Tim
    Chuaqui, Tomas
    Butler, Richard
    ENGINEERING FRACTURE MECHANICS, 2023, 286
  • [28] Deep & Attention : A Self-Attention based Neural Network for Remaining Useful Lifetime Predictions
    Li, Yuanjun
    Wang, Xingang
    2021 7TH INTERNATIONAL CONFERENCE ON MECHATRONICS AND ROBOTICS ENGINEERING (ICMRE 2021), 2021, : 98 - 105
  • [30] DLSA: dual-learning based on self-attention for rating prediction
    Qian, Fulan
    Huang, Yafan
    Li, Jianhong
    Wang, Chengjun
    Zhao, Shu
    Zhang, Yanping
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2021, 12 (07) : 1993 - 2005