Learning an Efficient Texture Model by Supervised Nonlinear Dimensionality Reduction Methods

Times Cited: 0
Authors
Barshan, Elnaz [1 ]
Behravan, Mina [1 ]
Azimifar, Zohreh [1 ]
Affiliations
[1] Shiraz Univ, Sch Elect & Comp Engn, Shiraz, Iran
Source
PROGRESS IN PATTERN RECOGNITION, IMAGE ANALYSIS, COMPUTER VISION, AND APPLICATIONS, PROCEEDINGS | 2009, Vol. 5856
Keywords
Texture Recognition; Texton; Dimensionality Reduction;
DOI
None available
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This work investigates the problem of texture recognition under varying lighting and viewing conditions. One of the most successful approaches to this problem focuses on textons, which describe local properties of textures. Leung and Malik [1] introduced this framework, and later researchers extended it to address its limitations, such as the high dimensionality of textons and feature histograms, as well as poor classification of a single image under known conditions. In this paper, we overcome the above-mentioned drawbacks by using recently introduced supervised nonlinear dimensionality reduction methods. These methods yield an embedding in which data instances from the same class lie close to each other while data from different classes are separated as much as possible. Here, we take advantage of modified methods such as "Colored Maximum Variance Unfolding," one of the most efficient heuristics for supervised dimensionality reduction. The CUReT (Columbia-Utrecht Reflectance and Texture) database is used to evaluate the proposed method. Experimental results indicate that the proposed algorithm clearly outperforms existing methods. In addition, we show that the intrinsic dimensionality of the data is much lower than the number of measurements available for each item. This allows us to analyze high-dimensional data in practice and to benefit from data visualization.
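The core idea the abstract describes, an embedding that keeps same-class instances close while pushing different classes apart, can be illustrated with a minimal sketch. Note this uses classical Fisher discriminant analysis as a stand-in, not the paper's Colored Maximum Variance Unfolding, and the toy data below is hypothetical:

```python
import numpy as np

def fisher_embedding(X, y, n_components=1):
    """Find projection directions maximizing between-class scatter
    relative to within-class scatter (classical Fisher LDA)."""
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean_all)[:, None]
        Sb += len(Xc) * (diff @ diff.T)
    # Generalized eigenproblem Sb v = lambda Sw v, via pinv(Sw) @ Sb
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(eigvals.real)[::-1]
    return eigvecs.real[:, order[:n_components]]

# Toy two-class data in 3-D: one informative dimension plus noise.
rng = np.random.default_rng(0)
X0 = rng.normal(loc=[0.0, 0.0, 0.0], scale=0.3, size=(50, 3))
X1 = rng.normal(loc=[2.0, 0.0, 0.0], scale=0.3, size=(50, 3))
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)

W = fisher_embedding(X, y)
Z = X @ W  # 1-D embedding: same-class points cluster, classes separate
```

In the 1-D embedding `Z`, the gap between the two class means is large relative to the within-class spread, which is the property supervised dimensionality reduction methods optimize for; the nonlinear methods discussed in the paper achieve an analogous effect without assuming a linear projection.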
Pages: 209-216 (8 pages)
Related Papers
50 records
[21]   Learning Through Non-linearly Supervised Dimensionality Reduction [J].
Grabocka, Josif ;
Schmidt-Thieme, Lars .
TRANSACTIONS ON LARGE-SCALE DATA- AND KNOWLEDGE-CENTERED SYSTEMS XVII, 2015, 8970 :74-96
[22]   Bayesian Supervised Dimensionality Reduction [J].
Gonen, Mehmet .
IEEE TRANSACTIONS ON CYBERNETICS, 2013, 43 (06) :2179-2189
[23]   Efficient Dimensionality Reduction Methods in Reservoir History Matching [J].
Tadjer, Amine ;
Bratvold, Reidar B. ;
Hanea, Remus G. .
ENERGIES, 2021, 14 (11)
[24]   Supervised Gaussian Process Latent Variable Model for Dimensionality Reduction [J].
Gao, Xinbo ;
Wang, Xiumei ;
Tao, Dacheng ;
Li, Xuelong .
IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS, 2011, 41 (02) :425-434
[25]   Locality Constrained Dictionary Learning for Nonlinear Dimensionality Reduction [J].
Zhou, Yin ;
Barner, Kenneth E. .
IEEE SIGNAL PROCESSING LETTERS, 2013, 20 (04) :335-338
[26]   A Supervised Learning Method Combine With Dimensionality Reduction in Vietnamese Text Summarization [J].
Ha Nguyen Thi Thu ;
Quynh Nguyen Huu ;
Tu Nguyen Thi Ngoc .
2013 COMPUTING, COMMUNICATIONS AND IT APPLICATIONS CONFERENCE (COMCOMAP), 2013, :69-73
[27]   Entropic Semi-Supervised Dimensionality Reduction for Distance Metric Learning [J].
Levada, Alexandre L. M. .
INTERNATIONAL JOURNAL OF UNCERTAINTY FUZZINESS AND KNOWLEDGE-BASED SYSTEMS, 2025, 33 (02) :219-234
[28]   A unified semi-supervised dimensionality reduction framework for manifold learning [J].
Chatpatanasiri, Ratthachat ;
Kijsirikul, Boonserm .
NEUROCOMPUTING, 2010, 73 (10-12) :1631-1640
[29]   Kernel logistic PLS: A tool for supervised nonlinear dimensionality reduction and binary classification [J].
Tenenhaus, Arthur ;
Giron, Alain ;
Viennet, Emmanuel ;
Bera, Michel ;
Saporta, Gilbert ;
Fertil, Bernard .
COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2007, 51 (09) :4083-4100
[30]   Effective Semi-supervised Nonlinear Dimensionality Reduction for Wood Defects Recognition [J].
Zhang, Zhao ;
Ye, Ning .
COMPUTER SCIENCE AND INFORMATION SYSTEMS, 2010, 7 (01) :127-138