Locality-Preserving Discriminant Analysis in Kernel-Induced Feature Spaces for Hyperspectral Image Classification

Cited by: 103
Authors
Li, Wei [1 ,2 ]
Prasad, Saurabh [1 ,2 ]
Fowler, James E. [1 ,2 ]
Bruce, Lori Mann [1 ,2 ]
Affiliations
[1] Mississippi State Univ, Geosyst Res Inst, Mississippi State, MS 39762 USA
[2] Mississippi State Univ, Dept Elect & Comp Engn, Mississippi State, MS 39762 USA
Funding
U.S. National Science Foundation
Keywords
Dimensionality reduction; feature space; hyperspectral imagery (HSI); kernel methods
DOI
10.1109/LGRS.2011.2128854
Chinese Library Classification (CLC)
P3 [Geophysics]; P59 [Geochemistry]
Discipline Codes
0708; 070902
Abstract
Linear discriminant analysis (LDA) has been widely applied to hyperspectral image (HSI) analysis as a popular method for feature extraction and dimensionality reduction. Linear methods such as LDA work well for unimodal Gaussian class-conditional distributions. However, when data samples between classes are nonlinearly separated in the input space, linear methods such as LDA are expected to fail. Kernel discriminant analysis (KDA) attempts to address this issue by mapping data in the input space onto a subspace such that Fisher's ratio in an intermediate (higher-dimensional) kernel-induced space is maximized. In recent studies with HSI data, KDA has been shown to outperform LDA, particularly when the data distributions are non-Gaussian and multimodal, such as when pixels represent target classes severely mixed with background classes. In this letter, a modified KDA algorithm, kernel local Fisher discriminant analysis (KLFDA), is studied for HSI analysis. Unlike KDA, KLFDA imposes an additional constraint on the mapping: it ensures that neighboring points in the input space remain close in the projected subspace, and vice versa. Classification experiments with a challenging HSI task demonstrate that this approach outperforms current state-of-the-art HSI-classification methods.
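The letter itself contains no code; the following is a minimal NumPy/SciPy sketch of a KLFDA-style projection of the kind the abstract describes, assuming an RBF kernel, Sugiyama-style local Fisher weighting, and illustrative parameter values (gamma, knn, eps). The function names (rbf_kernel, klfda_fit, klfda_transform) and all defaults are hypothetical choices for illustration, not specifics taken from the paper.

# Minimal sketch of kernel local Fisher discriminant analysis (KLFDA).
# Assumptions (not from the letter): RBF kernel, local-scaling affinities,
# LFDA-style locality weights, and a small ridge term for stability.
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist


def rbf_kernel(X1, X2, gamma=1.0):
    """RBF (Gaussian) kernel matrix between the rows of X1 and X2."""
    return np.exp(-gamma * cdist(X1, X2, "sqeuclidean"))


def klfda_fit(X, y, n_components=10, gamma=1.0, knn=7, eps=1e-6):
    """Return kernel expansion coefficients alpha (n_samples x n_components)."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)

    # Pairwise affinities with local scaling (distance to the knn-th neighbor).
    D2 = cdist(X, X, "sqeuclidean")
    sigma = np.sqrt(np.sort(D2, axis=1)[:, knn])
    A = np.exp(-D2 / (np.outer(sigma, sigma) + 1e-12))

    # Locality-weighted within-class (Ww) and between-class (Wb) weights.
    Ww = np.zeros((n, n))
    Wb = np.full((n, n), 1.0 / n)
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        nc = len(idx)
        block = A[np.ix_(idx, idx)]
        Ww[np.ix_(idx, idx)] = block / nc
        Wb[np.ix_(idx, idx)] = block * (1.0 / n - 1.0 / nc)

    # Graph Laplacians of the two weight matrices.
    Lw = np.diag(Ww.sum(axis=1)) - Ww
    Lb = np.diag(Wb.sum(axis=1)) - Wb

    # Generalized eigenproblem in the kernel-induced space:
    # maximize alpha' (K Lb K) alpha subject to alpha' (K Lw K) alpha = 1,
    # i.e., a locality-preserving Fisher ratio.
    Sb = K @ Lb @ K
    Sw = K @ Lw @ K + eps * np.eye(n)
    evals, evecs = eigh(Sb, Sw)
    order = np.argsort(evals)[::-1][:n_components]
    return evecs[:, order]


def klfda_transform(X_train, X_new, alpha, gamma=1.0):
    """Project new samples: z = K(X_new, X_train) @ alpha."""
    return rbf_kernel(X_new, X_train, gamma) @ alpha

In an HSI pipeline, the low-dimensional features returned by klfda_transform would then be passed to a conventional classifier; the specific classifiers and parameter settings used in the letter's experiments are not reproduced here.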
Pages: 894-898
Number of pages: 5