Variable Step Sizes for Iterative Jacobian-Based Inverse Kinematics of Robotic Manipulators

Cited by: 4
Authors
Colan, Jacinto [1 ]
Davila, Ana [2 ]
Hasegawa, Yasuhisa [2 ]
Affiliations
[1] Nagoya Univ, Dept Micronano Mech Sci & Engn, Nagoya 4648603, Japan
[2] Nagoya Univ, Inst Innovat Future Soc, Nagoya 4648601, Japan
Funding
Japan Society for the Promotion of Science (JSPS); Japan Science and Technology Agency (JST);
Keywords
Kinematics; Jacobian matrices; Robots; Optimization; Iterative methods; End effectors; Manipulator dynamics; Sampling methods; Nonlinear systems; Robotic manipulators; inverse kinematics; variable step size; random sampling; nonlinear optimization; iterative methods; EFFICIENT GRADIENT-METHOD; STEEPEST DESCENT; MINIMIZATION; ALGORITHM;
DOI
10.1109/ACCESS.2024.3418206
Chinese Library Classification (CLC) Number
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
This study evaluates the impact of step size selection on Jacobian-based inverse kinematics (IK) for robotic manipulators. Although traditional constant step size approaches offer simplicity, they often exhibit limitations in convergence speed and performance. To address these challenges, we propose and evaluate novel variable step size strategies. Our work explores three approaches: gradient-based dynamic selection, cyclic alternation, and random sampling techniques. We conducted extensive experiments on various manipulator kinematic chains and IK algorithms to demonstrate the benefits of these approaches. In particular, variable step sizes randomly derived from a normal distribution consistently improve solve rates across all evaluated cases compared to constant step sizes. Incorporating random restarts further enhances performance, effectively mitigating the effect of local minima. Our results suggest that variable step size strategies can improve the performance of Jacobian-based IK methods for robotic manipulators and have potential applications in other nonlinear optimization problems.
Pages: 87909-87922
Number of pages: 14
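
As a hedged illustration of the abstract's random-sampling strategy, the sketch below runs a Jacobian-pseudoinverse IK loop in which the step size is re-drawn each iteration from a normal distribution instead of being held constant. It is not the authors' implementation: the toy 2-link planar arm, the function names (fk, jac, ik_random_step), and the parameters mu and sigma are assumptions introduced only for illustration.

```python
# Minimal sketch (assumed, not the paper's code): Jacobian-pseudoinverse IK
# with a step size alpha sampled each iteration from a normal distribution,
# mirroring the "random sampling" step size strategy named in the abstract.
import numpy as np

L1, L2 = 1.0, 0.8  # link lengths of an illustrative 2-link planar arm

def fk(q):
    """End-effector position of the toy planar arm."""
    return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                     L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

def jac(q):
    """Analytic 2x2 Jacobian of the toy planar arm."""
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

def ik_random_step(q0, x_target, mu=0.5, sigma=0.2,
                   tol=1e-5, max_iters=500, seed=0):
    """Iterate dq = alpha * pinv(J) @ e with alpha ~ N(mu, sigma^2), alpha > 0."""
    rng = np.random.default_rng(seed)
    q = np.array(q0, dtype=float)
    for _ in range(max_iters):
        e = x_target - fk(q)                 # task-space position error
        if np.linalg.norm(e) < tol:
            return q, True                   # converged
        alpha = float(np.clip(rng.normal(mu, sigma), 1e-3, 1.0))
        q = q + alpha * np.linalg.pinv(jac(q)) @ e
    return q, False                          # iteration budget exhausted

q_sol, ok = ik_random_step([0.3, 0.3], np.array([1.2, 0.6]))
print(ok, fk(q_sol))
```

Clipping alpha to a small positive range is an arbitrary safeguard in this sketch, not a detail taken from the paper; the paper's gradient-based and cyclic step size strategies would replace only the line that chooses alpha.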