Three-dimensional Object Recognition via Subspace Representation on a Grassmann Manifold

Cited by: 2
Authors
Yataka, Ryoma [1]
Fukui, Kazuhiro [1]
Affiliations
[1] Univ Tsukuba, Grad Sch Syst & Informat Engn, 1-1-1 Tennodai, Tsukuba, Ibaraki 3058573, Japan
Source
ICPRAM: PROCEEDINGS OF THE 6TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION APPLICATIONS AND METHODS | 2017
Keywords
Three-dimensional Object Recognition; Subspace Representation; Canonical Angles; Grassmann Manifold; Mutual Subspace Method; FACTORIZATION METHOD; SHAPE;
DOI
10.5220/0006204702080216
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
In this paper, we propose a method for recognizing three-dimensional (3D) objects using multi-view depth images. To extract the essential 3D shape information from these images for stable and accurate 3D object recognition, we need to consider how to integrate the partial shapes of a 3D object. To address this issue, we introduce two ideas. The first idea is to represent a partial shape of the 3D object by a three-dimensional subspace in a high-dimensional vector space. The second idea is to represent a set of these shape subspaces as a subspace on a Grassmann manifold, which reflects the 3D shape of the object more completely. Further, we measure the similarity between two subspaces on the Grassmann manifold by using the canonical angles between them. This measurement enables us to construct a more stable and accurate method based on richer information about the 3D shape. We refer to this method based on subspaces on a Grassmann manifold as the Grassmann mutual subspace method (GMSM). To further enhance the performance of the GMSM, we equip it with powerful feature-extraction capabilities. The validity of the proposed method is demonstrated through experimental comparisons with several conventional methods on a hand-depth image dataset.
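The similarity measure described in the abstract rests on canonical (principal) angles: the cosines of these angles are the singular values of the product of two orthonormal basis matrices. A minimal NumPy sketch of this computation is below; the function names and the mean-squared-cosine similarity are illustrative conventions common in mutual-subspace-style methods, not the paper's exact GMSM formulation.

```python
import numpy as np

def canonical_angles(A, B):
    """Canonical (principal) angles between span(A) and span(B).

    A, B: matrices whose columns span the two subspaces.
    Returns the angles in radians, in ascending order.
    """
    # Orthonormalize each basis so singular values are true cosines.
    Qa, _ = np.linalg.qr(A)
    Qb, _ = np.linalg.qr(B)
    # Singular values of Qa^T Qb are cos(theta_i) of the canonical angles.
    s = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
    s = np.clip(s, -1.0, 1.0)  # guard against round-off outside [-1, 1]
    return np.arccos(s)[::-1]

def subspace_similarity(A, B):
    """Mean squared cosine of the canonical angles: 1 for identical
    subspaces, 0 for orthogonal ones (an illustrative MSM-style choice)."""
    return float(np.mean(np.cos(canonical_angles(A, B)) ** 2))
```

For example, two copies of the same 2D subspace of R^5 give similarity 1, while two mutually orthogonal 2D subspaces give 0.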
Pages: 208-216 (9 pages)