Variable Step Sizes for Iterative Jacobian-Based Inverse Kinematics of Robotic Manipulators

Cited: 4
Authors
Colan, Jacinto [1 ]
Davila, Ana [2 ]
Hasegawa, Yasuhisa [2 ]
Affiliations
[1] Nagoya Univ, Dept Micronano Mech Sci & Engn, Nagoya 4648603, Japan
[2] Nagoya Univ, Inst Innovat Future Soc, Nagoya 4648601, Japan
Source
IEEE ACCESS | 2024, Vol. 12
Funding
Japan Science and Technology Agency (JST); Japan Society for the Promotion of Science (JSPS)
Keywords
Kinematics; Jacobian matrices; Robots; Optimization; Iterative methods; End effectors; Manipulator dynamics; Sampling methods; Nonlinear systems; Robotic manipulators; inverse kinematics; variable step size; random sampling; nonlinear optimization; iterative methods; EFFICIENT GRADIENT-METHOD; STEEPEST DESCENT; MINIMIZATION; ALGORITHM;
DOI
10.1109/ACCESS.2024.3418206
CLC Number
TP [Automation Technology; Computer Technology]
Discipline Code
0812
Abstract
This study evaluates the impact of step size selection on Jacobian-based inverse kinematics (IK) for robotic manipulators. Although traditional constant step size approaches offer simplicity, they often exhibit limitations in convergence speed and performance. To address these challenges, we propose and evaluate novel variable step size strategies. Our work explores three approaches: gradient-based dynamic selection, cyclic alternation, and random sampling techniques. We conducted extensive experiments on various manipulator kinematic chains and IK algorithms to demonstrate the benefits of these approaches. In particular, variable step sizes randomly derived from a normal distribution consistently improve solve rates across all evaluated cases compared to constant step sizes. Incorporating random restarts further enhances performance, effectively mitigating the effect of local minima. Our results suggest that variable step size strategies can improve the performance of Jacobian-based IK methods for robotic manipulators and have potential applications in other nonlinear optimization problems.
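The random-sampling strategy described in the abstract can be illustrated with a minimal sketch: a Jacobian-pseudoinverse IK loop for a planar serial chain in which the step size at each iteration is drawn from a normal distribution rather than held constant. The planar forward kinematics, the arm geometry, and the distribution parameters `mu` and `sigma` below are illustrative assumptions, not the paper's actual experimental setup.

```python
import numpy as np

def fk(q, lengths):
    """Forward kinematics of a planar serial chain: end-effector (x, y)."""
    angles = np.cumsum(q)  # absolute link angles
    return np.array([np.sum(lengths * np.cos(angles)),
                     np.sum(lengths * np.sin(angles))])

def jacobian(q, lengths):
    """Analytic 2xN position Jacobian of the planar chain."""
    angles = np.cumsum(q)
    J = np.zeros((2, len(q)))
    for i in range(len(q)):
        # Joint i moves every link from i onward.
        J[0, i] = -np.sum(lengths[i:] * np.sin(angles[i:]))
        J[1, i] = np.sum(lengths[i:] * np.cos(angles[i:]))
    return J

def ik_random_step(target, q0, lengths, mu=0.5, sigma=0.2,
                   tol=1e-4, max_iter=500, rng=None):
    """Pseudoinverse IK with a step size drawn from N(mu, sigma) each iteration."""
    rng = np.random.default_rng() if rng is None else rng
    q = q0.astype(float).copy()
    for _ in range(max_iter):
        err = target - fk(q, lengths)
        if np.linalg.norm(err) < tol:
            return q, True
        alpha = abs(rng.normal(mu, sigma))  # variable step size, kept positive
        q = q + alpha * np.linalg.pinv(jacobian(q, lengths)) @ err
    return q, False
```

A constant-step variant is recovered by fixing `alpha`; random restarts, as the abstract notes, amount to rerunning `ik_random_step` from a freshly sampled `q0` whenever the loop returns `False`.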
Pages: 87909-87922
Page count: 14
References
56 entries
[1]  
Altschuler JM, 2023, Arxiv, DOI arXiv:2309.07879
[2]   Convex Optimization: Algorithms and Complexity [J].
[Anonymous] .
FOUNDATIONS AND TRENDS IN MACHINE LEARNING, 2015, 8 (3-4) :232-+
[3]  
[Anonymous], 1983, Numerical methods for unconstrained optimization and nonlinear equations
[5]   2-POINT STEP SIZE GRADIENT METHODS [J].
BARZILAI, J ;
BORWEIN, JM .
IMA JOURNAL OF NUMERICAL ANALYSIS, 1988, 8 (01) :141-148
[6]  
Beeson P, 2015, IEEE-RAS INT C HUMAN, P928, DOI 10.1109/HUMANOIDS.2015.7363472
[7]  
Berahas AS, 2021, Arxiv, DOI arXiv:1905.01332
[8]  
Bernstein J, 2023, Arxiv, DOI arXiv:2304.05187
[9]   Learning with Random Learning Rates [J].
Blier, Leonard ;
Wolinski, Pierre ;
Ollivier, Yann .
MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2019, PT II, 2020, 11907 :449-464
[10]  
Buss S. R., 2005, Journal of Graphics Tools, V10, P37