Real-Time Visual Measurement With Opponent Hitting Behavior for Table Tennis Robot

Cited by: 17
Authors
Zhang, Kun [1 ,2 ]
Cao, Zhiqiang [1 ,2 ]
Liu, Jianran [1 ,2 ]
Fang, Zaojun [3 ]
Tan, Min [1 ,2 ]
Affiliations
[1] Chinese Acad Sci, Inst Automat, State Key Lab Management & Control Complex Syst, Beijing 100190, Peoples R China
[2] Univ Chinese Acad Sci, Beijing 100049, Peoples R China
[3] Chinese Acad Sci, Ningbo Inst Ind Technol, Ningbo 315201, Zhejiang, Peoples R China
Funding
Beijing Natural Science Foundation; National Natural Science Foundation of China;
Keywords
Hitting point; opponent hitting behavior; table tennis robot; visual measurement; PING-PONG PLAYER; PREDICTION;
DOI
10.1109/TIM.2017.2789139
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Discipline Classification Codes
0808; 0809;
Abstract
In this paper, a vision-based method is proposed for a table tennis robot to estimate the opponent's hitting point, which provides a better understanding of the opponent and thus improves the robot's responsiveness. As an essential manifestation of the opponent's hitting behavior, the hitting-point information includes not only the change in the ball's motion status before and after being hit, but also the racket pose at the hitting moment. To obtain this information, the trajectories of the ball before and after being hit are first predicted based on visual measurement and the motion model of the ball. The opponent's racket trajectory is then derived by a self-adaptive threshold selection scheme and a multifilter. Considering that the ball and racket trajectories are not absolutely precise, an optimized solution is proposed to compute the opponent's hitting point. To the best of our knowledge, the proposed approach is the first to achieve fast estimation of the opponent's hitting point with satisfactory resolution. The effectiveness of the approach is verified by experiments.
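The abstract's core idea — reconciling imprecise pre-hit and post-hit ball trajectories into a single hitting-point estimate — can be illustrated with a minimal sketch. This is not the authors' optimized solution; it assumes simple quadratic (ballistic) flight, ignores drag and spin, and all function names here are hypothetical. It fits each trajectory segment separately, then picks the time in the gap between them where the two extrapolations come closest:

```python
import numpy as np

def fit_trajectory(t, pos):
    # Fit a per-axis quadratic p(t) = a*t^2 + b*t + c (ballistic flight, drag ignored).
    return [np.polyfit(t, pos[:, k], 2) for k in range(3)]

def eval_traj(coeffs, t):
    # Evaluate the fitted 3-D position at time t.
    return np.array([np.polyval(c, t) for c in coeffs])

def estimate_hitting_point(t_pre, p_pre, t_post, p_post):
    """Illustrative estimate: extrapolate the pre-hit and post-hit fits into
    the unobserved gap and take the time of closest approach as the hit."""
    c_pre = fit_trajectory(t_pre, p_pre)
    c_post = fit_trajectory(t_post, p_post)
    # Search only the gap between the last pre-hit and first post-hit samples.
    ts = np.linspace(t_pre[-1], t_post[0], 500)
    gaps = [np.linalg.norm(eval_traj(c_pre, t) - eval_traj(c_post, t)) for t in ts]
    t_hit = ts[int(np.argmin(gaps))]
    # Average both extrapolations to soften measurement noise in either segment.
    p_hit = 0.5 * (eval_traj(c_pre, t_hit) + eval_traj(c_post, t_hit))
    return t_hit, p_hit
```

In the paper this reconciliation is posed as an optimization that additionally incorporates the racket trajectory; the sketch above only captures the ball-side constraint.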
Pages: 811-820
Page count: 10
Related Papers
50 records
  • [31] Table Tennis Training Results with Robot: Spin Rate and Hitting Speed in Forehand Loop-Drives
    Kovacs, Istvan
    McClinton, Austin
    Rauenzahn, Catherine
    Liu, Wenhao
    MEDICINE AND SCIENCE IN SPORTS AND EXERCISE, 2019, 51 (06): : 956 - 957
  • [32] A Learning Framework Towards Real-time Detection and Localization of a Ball for Robotic Table Tennis System
    Zhao, Yongsheng
    Wu, Jun
    Zhu, Yifeng
    Yu, Hongxiang
    Xiong, Rong
    2017 IEEE INTERNATIONAL CONFERENCE ON REAL-TIME COMPUTING AND ROBOTICS (RCAR), 2017, : 97 - 102
  • [33] Computational Analysis of Table Tennis Matches from Real-Time Videos Using Deep Learning
    Zhou, Hong
    Nguyen, Minh
    Yan, Wei Qi
    IMAGE AND VIDEO TECHNOLOGY, PSIVT 2023, 2024, 14403 : 69 - 81
  • [34] Feature Fusion based Efficient Convolution Network for Real-time Table Tennis Ball Detection
    Yang, Luo
    Sheng, Xinjun
    Zhu, Xiangyang
    Zhang, Haibo
    2020 13TH INTERNATIONAL SYMPOSIUM ON COMPUTATIONAL INTELLIGENCE AND DESIGN (ISCID 2020), 2020, : 300 - 305
  • [35] Visual routines for real-time monitoring of vehicle behavior
    Marco Aste
    Massimo Rossi
    Roldano Cattoni
    Bruno Caprile
    Machine Vision and Applications, 1998, 11 : 16 - 23
  • [36] Visual routines for real-time monitoring of vehicle behavior
    Aste, M
    Rossi, M
    Cattoni, R
    Caprile, B
    MACHINE VISION AND APPLICATIONS, 1998, 11 (01) : 16 - 23
  • [37] Real-Time Robot Positioning based on Measurement Feedback Control
    Loser, Raimund
    Kleinkes, Michael
    SAE INTERNATIONAL JOURNAL OF MATERIALS AND MANUFACTURING, 2016, 9 (01) : 106 - 111
  • [38] Real-time pose measurement of parallel robot based on GRNN
    Guoqin, Gao
    Zhigang, Zhang
    Xuemei, Niu
    Telkomnika - Indonesian Journal of Electrical Engineering, 2013, 11 (05): : 2315 - 2322
  • [39] Real-Time Model Based Visual Servoing Tasks on a Humanoid Robot
    Abou Moughlbay, Amine
    Cervera, Enric
    Martinet, Philippe
    INTELLIGENT AUTONOMOUS SYSTEMS 12, VOL 1, 2013, 193 : 321 - +
  • [40] Real-Time Seam Tracking Technology of Welding Robot with Visual Sensing
    Hongyuan Shen
    Tao Lin
    Shanben Chen
    Laiping Li
    Journal of Intelligent & Robotic Systems, 2010, 59 : 283 - 298