Neural Contact Fields: Tracking Extrinsic Contact with Tactile Sensing

Cited by: 8
Authors
Higuera, Carolina [1 ]
Dong, Siyuan [1 ]
Boots, Byron [1 ]
Mukadam, Mustafa [2 ]
Affiliations
[1] Univ Washington, Seattle, WA 98195 USA
[2] Meta AI, New York, NY USA
Source
2023 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2023) | 2023
DOI
10.1109/ICRA48891.2023.10160526
Chinese Library Classification (CLC)
TP [Automation technology, computer technology]
Discipline Code
0812
Abstract
We present Neural Contact Fields, a method that brings together neural fields and tactile sensing to address the problem of tracking extrinsic contact between an object and its environment. Knowing where the external contact occurs is a first step towards methods that can actively control it to facilitate downstream manipulation tasks. Prior work on localizing environmental contacts typically assumes a contact type (e.g., point or line), does not capture contact/no-contact transitions, and only works with basic geometric-shaped objects. Neural Contact Fields are the first method that can track arbitrary multi-modal extrinsic contacts without making any assumptions about the contact type. Our key insight is to estimate the probability of contact for any 3D point in the latent space of object shapes, given vision-based tactile inputs that sense the local motion resulting from the external contact. In experiments, we find that Neural Contact Fields are able to localize multiple contact patches without making any assumptions about the geometry of the contact, and to capture contact/no-contact transitions for known categories of objects with unseen shapes in unseen environment configurations. In addition to Neural Contact Fields, we release our YCB-Extrinsic-Contact dataset of simulated extrinsic contact interactions to enable further research in this area. Project page: https://github.com/carolinahiguera/NCF
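To make the key idea in the abstract concrete, below is a minimal, hypothetical sketch (in PyTorch) of a neural field that maps a 3D query point, an object-shape latent code, and an embedding of vision-based tactile inputs to a per-point contact probability. The module name, layer sizes, and input dimensions are illustrative assumptions and do not reproduce the authors' actual architecture or training pipeline (see the project page for the released code).

```python
# Hypothetical sketch of a contact field: an MLP conditioned on a shape latent
# code and a tactile embedding, queried at arbitrary 3D points. Sizes are
# illustrative assumptions, not the authors' architecture.
import torch
import torch.nn as nn

class ContactFieldSketch(nn.Module):
    def __init__(self, shape_latent_dim=64, tactile_dim=32, hidden=128):
        super().__init__()
        in_dim = 3 + shape_latent_dim + tactile_dim  # xyz + shape code + tactile embedding
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),  # logit of extrinsic-contact probability
        )

    def forward(self, query_xyz, shape_code, tactile_emb):
        # query_xyz:   (B, N, 3) 3D points on or near the object surface
        # shape_code:  (B, D_s)  latent code for the object's shape
        # tactile_emb: (B, D_t)  embedding of the vision-based tactile signal
        B, N, _ = query_xyz.shape
        cond = torch.cat([shape_code, tactile_emb], dim=-1)    # (B, D_s + D_t)
        cond = cond.unsqueeze(1).expand(B, N, cond.shape[-1])  # broadcast to each query point
        x = torch.cat([query_xyz, cond], dim=-1)
        return torch.sigmoid(self.mlp(x)).squeeze(-1)          # (B, N) contact probabilities

# Example query with random tensors
model = ContactFieldSketch()
probs = model(torch.rand(2, 512, 3), torch.randn(2, 64), torch.randn(2, 32))
print(probs.shape)  # torch.Size([2, 512])
```

Querying such a field densely over the object surface would, in principle, yield multi-modal contact patches and contact/no-contact transitions without committing to a point or line contact model, which is the behavior the abstract describes.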
Pages: 12576-12582
Page count: 7