Gibson Env: Real-World Perception for Embodied Agents

Cited by: 428
Authors
Xia, Fei [1]
Zamir, Amir R. [1,2]
He, Zhiyang [1]
Sax, Alexander [1]
Malik, Jitendra [2]
Savarese, Silvio [1]
Affiliations
[1] Stanford Univ, Stanford, CA 94305 USA
[2] Univ Calif Berkeley, Berkeley, CA 94720 USA
Source
2018 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR) | 2018
DOI
10.1109/CVPR.2018.00945
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Developing visual perception models for active agents and sensorimotor control in the physical world is cumbersome, as existing algorithms are too slow to learn efficiently in real time and robots are fragile and costly. This has given rise to learning in simulation, which in turn raises the question of whether the results transfer to the real world. In this paper, we investigate developing real-world perception for active agents, propose Gibson Environment for this purpose, and showcase a set of perceptual tasks learned therein. Gibson is based upon virtualizing real spaces, rather than artificially designed ones, and currently includes over 1400 floor spaces from 572 full buildings. The main characteristics of Gibson are: I. being from the real world and reflecting its semantic complexity; II. having an internal synthesis mechanism, "Goggles", that enables deploying trained models in the real world without needing domain adaptation; III. embodiment of agents, making them subject to the constraints of physics and space.
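The abstract describes embodied agents that perceive (e.g. RGB and depth) and act under physical constraints inside a simulated environment. As a rough illustration of the sense-act loop such environments typically expose, the sketch below uses a toy, self-contained stand-in with a gym-style `reset`/`step` interface. The class `DummyEmbodiedEnv`, its observation keys, and the action names are hypothetical placeholders for illustration only, not Gibson's actual API.

```python
import random

class DummyEmbodiedEnv:
    """Toy stand-in for an embodied perception environment with a
    gym-style reset/step interface (hypothetical, not Gibson's API)."""

    def __init__(self, episode_len=10):
        self.episode_len = episode_len
        self.t = 0

    def reset(self):
        """Start a new episode and return the initial observation."""
        self.t = 0
        # Placeholder 4x4 "images" standing in for RGB and depth frames.
        return {"rgb": [[0.0] * 4 for _ in range(4)],
                "depth": [[1.0] * 4 for _ in range(4)]}

    def step(self, action):
        """Advance one timestep; return (observation, reward, done, info)."""
        self.t += 1
        obs = {"rgb": [[0.0] * 4 for _ in range(4)],
               "depth": [[1.0] * 4 for _ in range(4)]}
        reward = 1.0 if action == "forward" else 0.0
        done = self.t >= self.episode_len
        return obs, reward, done, {}

# Minimal sense-act loop with a random policy.
env = DummyEmbodiedEnv()
obs = env.reset()
total_reward = 0.0
done = False
while not done:
    action = random.choice(["forward", "turn_left", "turn_right"])
    obs, reward, done, info = env.step(action)
    total_reward += reward
```

A learned policy would replace `random.choice` with a model mapping the observation (the perceptual input) to an action, which is the setting the perceptual tasks in the paper are trained in.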
Pages: 9068-9079
Page count: 12