Dimensionality Reduction via Regression in Hyperspectral Imagery

Citations: 35
Authors
Laparra, Valero [1 ]
Malo, Jesus [1 ]
Camps-Valls, Gustau [1 ]
Affiliations
[1] Univ Valencia, IPL, Valencia 46980, Spain
Keywords
Dimensionality reduction via regression; hyperspectral sounder; Infrared Atmospheric Sounding Interferometer (IASI); Landsat; manifold learning; nonlinear dimensionality reduction; principal component analysis (PCA); principal curves; atmospheric profiles; retrieval
DOI
10.1109/JSTSP.2015.2417833
Chinese Library Classification
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline codes
0808; 0809;
Abstract
This paper introduces a new unsupervised method for dimensionality reduction via regression (DRR). The algorithm belongs to the family of invertible transforms that generalize principal component analysis (PCA) by using curvilinear instead of linear features. DRR identifies the nonlinear features through multivariate regression, which reduces the redundancy between the PCA coefficients, the variance of the scores, and the reconstruction error. More importantly, unlike other nonlinear dimensionality reduction methods, DRR is invertible, volume-preserving, and admits a straightforward out-of-sample extension, which makes it interpretable and easy to apply. These properties enable DRR to learn a broader class of data manifolds than the recently proposed nonlinear principal component analysis (NLPCA) and principal polynomial analysis (PPA). We illustrate the performance of the representation in reducing the dimensionality of remote sensing data. In particular, we tackle two common problems: processing very high-dimensional spectral information, as in hyperspectral image sounding data, and dealing with spatial-spectral image patches of multispectral images. Both settings pose collinearity and ill-determination problems. The expressive power of the features is assessed in terms of truncation error, the estimation of atmospheric variables, and surface land-cover classification error. Results show that DRR outperforms linear PCA and recently proposed invertible extensions based on neural networks (NLPCA) and univariate regressions (PPA).
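The core construction the abstract describes can be sketched as follows: rotate the data with PCA, then replace each PCA score with its residual after predicting it from the leading scores via multivariate regression; because each step only subtracts a function of already-available coordinates, the transform is exactly invertible. This is a minimal illustrative sketch, not the authors' implementation: the function names are made up here, and a simple polynomial least-squares fit stands in for whatever multivariate regressor the paper employs.

```python
import numpy as np

def _poly_features(Z, degree):
    """Per-coordinate polynomial basis with a bias column.

    Illustrative stand-in for the paper's multivariate regressor.
    """
    cols = [np.ones(len(Z))]
    for d in range(1, degree + 1):
        cols.extend(Z[:, j] ** d for j in range(Z.shape[1]))
    return np.column_stack(cols)

def drr_forward(X, degree=2):
    """Toy DRR: PCA rotation, then keep only the residual of each
    PCA score after regressing it on the leading scores."""
    mean = X.mean(axis=0)
    Xc = X - mean
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)  # PCA axes
    Y = Xc @ Vt.T                                      # PCA scores
    R = Y.copy()
    weights = []
    for i in range(1, Y.shape[1]):
        Phi = _poly_features(Y[:, :i], degree)
        w, *_ = np.linalg.lstsq(Phi, Y[:, i], rcond=None)
        weights.append(w)
        R[:, i] = Y[:, i] - Phi @ w  # residual: redundancy removed
    return R, (mean, Vt, weights, degree)

def drr_inverse(R, params):
    """Exact inverse: rebuild the PCA scores sequentially (each step
    only needs coordinates already recovered), then rotate back."""
    mean, Vt, weights, degree = params
    Y = R.copy()
    for i in range(1, Y.shape[1]):
        Phi = _poly_features(Y[:, :i], degree)
        Y[:, i] = R[:, i] + Phi @ weights[i - 1]
    return Y @ Vt + mean
```

Because trailing residuals carry far less variance than the raw PCA scores, truncating them loses less information, which is the source of the truncation-error gains reported over linear PCA.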
Pages: 1026-1036
Page count: 11