Multi-Modal and Adaptive Robot Control through Hierarchical Quadratic Programming

Cited by: 0
Authors
Tassi, Francesco [1 ]
Ajoudani, Arash [1 ]
Affiliations
[1] Ist Italiano Tecnol, Human Robot Interfaces & Interact Lab, Genoa, Italy
Funding
European Research Council
Keywords
Hierarchical optimal control; Adaptive compliance; Human-robot collaboration; Variable impedance control; Force control; MULTIROBOT; MOTION; BEHAVIOR;
DOI
10.1007/s10846-024-02193-1
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
This paper proposes a novel Hierarchical Quadratic Programming (HQP)-based framework that enables multi-tasking control under multiple Human-Robot Interaction (HRI) scenarios. The proposed controllers' formulations are inspired by real-world contact-rich scenarios, which currently constitute one of the main limitations to widespread practical deployment. Indeed, HRI can occur through different modalities, depending on the human's needs. The objective is to create a single framework for the various possible types of interaction, avoiding the need to switch between different control architectures, which requires dealing with discontinuities. To achieve this, we first propose an HQP-based hybrid Cartesian/joint-space impedance control formulation. Based on the robot's dynamics, this controller enables an adaptive compliance behaviour while achieving hierarchical motion control. This is validated through a series of experiments that show the accuracy of trajectory tracking, which remains on the order of 10 mm during fast motions thanks to the inclusion of the robot dynamics. Moreover, the hybrid compliance behaviour allows deviation from this accuracy when an interaction is present. We then consider the case in which the human needs to move the robot directly, proposing a hybrid admittance/impedance controller that is again based on an HQP formulation and provides inherent softening when conflicting tasks are present, or in close-to-limit and near-singular configurations. This is validated through several experiments in which the human easily moves the robot in the workspace via direct physical interaction. Next, we formulate an additional hierarchy that enables force control and allows the robot to maintain a specific interaction force at the end effector. We then extend this to simultaneous force and trajectory tracking.
Overall, we obtain a multi-purpose HQP-based control framework that seamlessly switches between interaction modes, enabling multiple hierarchical behaviours and covering a wide spectrum of interaction types essential for synergistic HRI.
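The core idea behind the hierarchical control described above is that lower-priority tasks are satisfied only in the null space of higher-priority ones. As a rough illustration only (not the paper's actual formulation, which handles inequality constraints, robot dynamics, and compliance), a lexicographic cascade for equality tasks can be sketched with null-space projections; the function name and task encoding here are hypothetical:

```python
import numpy as np

def hierarchical_ls(tasks):
    """Lexicographic least-squares: each task (A, b) asks for A @ x = b and is
    solved as well as possible without degrading any higher-priority task.
    Classic null-space projection cascade (equality tasks only)."""
    n = tasks[0][0].shape[1]
    x = np.zeros(n)
    N = np.eye(n)  # projector onto the null space of all tasks solved so far
    for A, b in tasks:
        AN = A @ N
        AN_pinv = np.linalg.pinv(AN)
        # minimal-norm correction restricted to the remaining null space
        x = x + N @ AN_pinv @ (b - A @ x)
        # shrink the null space: remove directions consumed by this task
        N = N @ (np.eye(n) - AN_pinv @ AN)
    return x

# Non-conflicting tasks: both are satisfied exactly.
x = hierarchical_ls([(np.array([[1.0, 0, 0]]), np.array([1.0])),
                     (np.array([[0.0, 1, 0]]), np.array([2.0]))])

# Conflicting tasks: the second cannot disturb the first and is ignored.
y = hierarchical_ls([(np.array([[1.0, 0, 0]]), np.array([1.0])),
                     (np.array([[1.0, 0, 0]]), np.array([5.0]))])
```

In an actual HQP each level is a constrained QP rather than an unconstrained least-squares problem, which is what lets the framework handle joint limits and the softening behaviour the abstract describes.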
Pages: 17
Related Papers (50 in total)
  • [11] An Introduction to the Multi-Modal Multi-Robot (MuMoMuRo) Control System
    Tse, Jason T. P.
    Chan, Stephen C. F.
    Ngai, Grace
    IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS (SMC 2010), 2010,
  • [12] Task Parametrization through Multi-modal Analysis of Robot Experiences
    Winkler, Jan
    Bozcuoglu, Asil Kaan
    Pomarlan, Mihai
    Beetz, Michael
    AAMAS'17: PROCEEDINGS OF THE 16TH INTERNATIONAL CONFERENCE ON AUTONOMOUS AGENTS AND MULTIAGENT SYSTEMS, 2017, : 1754 - 1756
  • [13] Hierarchical Adaptive Value Estimation for Multi-modal Visual Reinforcement Learning
    Huang, Yangru
    Peng, Peixi
    Zhao, Yifan
    Xu, Haoran
    Geng, Mengyue
    Tian, Yonghong
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [14] Hierarchical Multi-Modal Prompting Transformer for Multi-Modal Long Document Classification
    Liu, Tengfei
    Hu, Yongli
    Gao, Junbin
    Sun, Yanfeng
    Yin, Baocai
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2024, 34 (07) : 6376 - 6390
  • [15] The Design and Control of the Multi-Modal Locomotion Origami Robot, Tribot
    Zhakypov, Zhenishbek
    Falahi, Mohsen
    Shah, Manan
    Paik, Jamie
    2015 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2015, : 4349 - 4355
  • [16] SITUATION ASSESSMENT THROUGH MULTI-MODAL SENSING OF DYNAMIC ENVIRONMENTS TO SUPPORT COGNITIVE ROBOT CONTROL
    Badii, Atta
    Khan, Ali
    Raval, Rajkumar
    Oudi, Hamid
    Ayora, Ricardo
    Khan, Wasiq
    Jaidi, Amine
    Viswanathan, Nagarajan
    FACTA UNIVERSITATIS-SERIES MECHANICAL ENGINEERING, 2014, 12 (03) : 251 - 260
  • [17] Multi-modal Controls of A Smart Robot
    Mishra, Anurag
    Makula, Pooja
    Kumar, Akshay
    Karan, Krit
    Mittal, V. K.
    2015 ANNUAL IEEE INDIA CONFERENCE (INDICON), 2015,
  • [18] Multi-modal long document classification based on Hierarchical Prompt and Multi-modal Transformer
    Liu, Tengfei
    Hu, Yongli
    Gao, Junbin
    Wang, Jiapu
    Sun, Yanfeng
    Yin, Baocai
    NEURAL NETWORKS, 2024, 176
  • [19] Multi-modal User Interface for Multi-robot Control in Underground Environments*
    Chen, Shengkang
    O'Brien, Matthew J.
    Talbot, Fletcher
    Williams, Jason
    Tidd, Brendan
    Pitt, Alex
    Arkin, Ronald C.
    2022 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2022, : 9995 - 10002
  • [20] Design of Robot Teaching Assistants Through Multi-modal Human-Robot Interactions
    Ferrarelli, Paola
    Lazaro, Maria T.
    Iocchi, Luca
    ROBOTICS IN EDUCATION: LATEST RESULTS AND DEVELOPMENTS, 2018, 630 : 274 - 286