Making Industrial Robots Smarter with Adaptive Reasoning and Autonomous Thinking for Real-Time Tasks in Dynamic Environments: A Case Study

Cited by: 1
Authors
Zabalza, Jaime [1 ]
Fei, Zixiang [2 ]
Wong, Cuebong [2 ]
Yan, Yijun [1 ]
Mineo, Carmelo [1 ]
Yang, Erfu [2 ]
Rodden, Tony [3 ]
Mehnen, Jorn [2 ]
Pham, Quang-Cuong [4 ]
Ren, Jinchang [1 ]
Affiliations
[1] Univ Strathclyde, Dept Elect & Elect Engn, Glasgow G1 1XJ, Scotland
[2] Univ Strathclyde, Dept Design Mfg & Engn Management, Glasgow G1 1XJ, Scotland
[3] Univ Strathclyde, Adv Forming Res Ctr, Glasgow G1 1XJ, Scotland
[4] Nanyang Technol Univ, Sch Mech & Aerosp Engn, 50 Nanyang Ave, Singapore 639798, Singapore
Source
ADVANCES IN BRAIN INSPIRED COGNITIVE SYSTEMS, BICS 2018 | 2018, Vol. 10989
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
Machine vision; Path planning; Robot control; Adaptive reasoning; Dynamic environment; FRAMEWORK; MODEL;
DOI
10.1007/978-3-030-00563-4_77
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In order to extend the abilities of current industrial robots towards more autonomous and flexible manufacturing, this work presents an integrated system that combines real-time sensing, path planning and control, providing industrial robots with adaptive reasoning, autonomous thinking and environment interaction under dynamic and challenging conditions. The developed system consists of an intelligent motion planner for a six-degrees-of-freedom robotic manipulator, which performs pick-and-place tasks along an optimized path computed in real time while avoiding a moving obstacle in the workspace. The moving obstacle is tracked by a machine-vision sensing strategy that performs colour detection in the HSV colour space, chosen to cope with changing conditions such as a non-uniform background, lighting reflections and projected shadows. The machine-vision system is implemented as an off-board scheme with two low-cost cameras, where the second camera resolves the vision obstruction that occurs when the robot enters the field of view of the main sensor. Real-time performance of the overall system has been experimentally verified using a KUKA KR90 R3100 robot.
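To illustrate the sensing strategy described in the abstract, the following is a minimal, hypothetical Python/OpenCV sketch of HSV colour detection with a fallback to a second camera when the main view is obstructed. The HSV thresholds, camera indices, area heuristic and function names are illustrative assumptions and are not values or code reported in the paper.

# Hypothetical sketch of HSV-based obstacle tracking with a two-camera fallback.
# HSV thresholds, camera indices and the minimum-area heuristic are illustrative
# assumptions, not values taken from the paper.
import cv2
import numpy as np

HSV_LOW = np.array([100, 120, 70])    # assumed lower HSV bound for the obstacle colour
HSV_HIGH = np.array([130, 255, 255])  # assumed upper HSV bound
MIN_AREA = 500                        # assumed pixel-area threshold to reject noise

def detect_obstacle(frame):
    """Return the (x, y) pixel centroid of the largest coloured blob, or None."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, HSV_LOW, HSV_HIGH)
    # Morphological opening suppresses small speckle from reflections and shadows.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    if cv2.contourArea(largest) < MIN_AREA:
        return None
    m = cv2.moments(largest)
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

def track(main_cam, aux_cam):
    """Use the main camera; fall back to the auxiliary camera if the view is obstructed."""
    for cam in (main_cam, aux_cam):
        ok, frame = cam.read()
        if not ok:
            continue
        centroid = detect_obstacle(frame)
        if centroid is not None:
            return centroid, cam
    return None, None  # obstacle not visible from either viewpoint

if __name__ == "__main__":
    main_cam = cv2.VideoCapture(0)   # assumed device indices for the two low-cost cameras
    aux_cam = cv2.VideoCapture(1)
    centroid, source = track(main_cam, aux_cam)
    print("Obstacle centroid:", centroid)

In a real-time planner, the returned centroid would be mapped from image coordinates into the robot workspace and fed to the motion planner each control cycle; that calibration and planning step is outside the scope of this sketch.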
Pages: 790-800
Number of pages: 11