Human-Robot Collaboration With Commonsense Reasoning in Smart Manufacturing Contexts

Cited by: 26
Authors
Conti, Christopher J. [1 ]
Varde, Aparna S. [1 ]
Wang, Weitian [1 ]
Affiliations
[1] Montclair State Univ, Dept Comp Sci, Montclair, NJ 07043 USA
Funding
U.S. National Science Foundation;
Keywords
Robots; Task analysis; Collaboration; Commonsense reasoning; Smart manufacturing; Robot sensing systems; Planning; Collaborative robotics; commonsense knowledge; human-robot interaction; UN SDG 9; smart manufacturing; task quality optimization; KNOWLEDGE; FRAMEWORK;
DOI
10.1109/TASE.2022.3159595
CLC Number
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Human-robot collaboration (HRC), where humans and robots work together to handle specific tasks, requires designing robots that can effectively support human beings. Robots need to conduct reasoning using commonsense knowledge (CSK), i.e., fundamental knowledge that humans possess and use subconsciously, in order to assist humans in challenging and dynamic environments. Currently, there are several effective CSK systems for organizing information and facts, as well as for detecting objects and determining their properties. HRC is employed in various manufacturing tasks, such as paint spraying and assembly, to keep humans safe while increasing efficiency. Although there is a large body of research on HRC and on CSK, there is minimal research linking the two concepts. This paper presents a novel system for human-robot collaboration guided by commonsense reasoning for automation in manufacturing tasks, fitting within the general realm of smart manufacturing. The primary focus is on improving the efficacy of human-robot co-assembly tasks. Evaluations conducted with online simulations and real-world experiments indicate that reasoning using CSK-based robot priorities enhances HRC as compared to simpler robot priorities, e.g., merely handling nearby objects. The system is modifiable and can be applied to larger and more complex real-world tasks, thereby leading to improved automation in manufacturing. This paper demonstrates the scope of combining HRC and CSK, and future work can further exploit the benefits of uniting the two fields.
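The contrast the abstract draws, between a naive "handle the nearest object" policy and a CSK-informed priority, can be illustrated with a minimal sketch. This is not the paper's actual algorithm; the part names, distances, and the single hypothetical commonsense rule (parts are needed in assembly-step order) are assumptions for illustration only.

```python
# Illustrative sketch (not the paper's method): a naive nearest-object
# priority vs. a commonsense-knowledge (CSK) informed priority for choosing
# which part a robot should hand to a human co-assembler.
from dataclasses import dataclass

@dataclass
class Part:
    name: str
    distance: float      # distance from the robot gripper, in meters
    assembly_step: int   # step at which the human needs this part

def nearest_priority(parts, _current_step):
    """Naive baseline: always hand over whatever is closest."""
    return min(parts, key=lambda p: p.distance)

def csk_priority(parts, current_step):
    """Hypothetical CSK rule: prefer the part needed at the current
    assembly step; fall back to the nearest part if none matches."""
    needed = [p for p in parts if p.assembly_step == current_step]
    pool = needed if needed else parts
    return min(pool, key=lambda p: p.distance)

parts = [
    Part("screw", 0.2, assembly_step=3),
    Part("bracket", 0.8, assembly_step=1),
]

print(nearest_priority(parts, 1).name)  # the closest part, wrong step
print(csk_priority(parts, 1).name)      # the part needed at step 1
```

At step 1, the baseline hands over the nearby screw, while the CSK-informed policy selects the bracket the human actually needs, which is the kind of priority difference the paper's evaluations compare.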
Pages: 1784-1797
Page count: 14