Classification and Reconstruction From Random Projections for Hyperspectral Imagery

Cited by: 30
Authors
Li, Wei [1]
Prasad, Saurabh [2]
Fowler, James E. [3]
Affiliations
[1] Univ Calif Davis, Ctr Spatial Technol & Remote Sensing, Davis, CA 95616 USA
[2] Univ Houston, Dept Elect & Comp Engn, Houston, TX 77204 USA
[3] Mississippi State Univ, Dept Elect & Comp Engn, Starkville, MS 39762 USA
Source
IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING | 2013, Vol. 51, No. 2
Funding
U.S. National Science Foundation;
Keywords
Dimensionality reduction; Gaussian mixture model (GMM); hyperspectral data; random projection; support vector machine (SVM); NONLINEAR DIMENSIONALITY REDUCTION; MODEL; LINDENSTRAUSS; SEGMENTATION; JOHNSON;
DOI
10.1109/TGRS.2012.2204759
Chinese Library Classification (CLC)
P3 [Geophysics]; P59 [Geochemistry];
Discipline Classification Codes
0708; 070902;
Abstract
There is increasing interest in dimensionality reduction through random projections due in part to the emerging paradigm of compressed sensing. It is anticipated that signal acquisition with random projections will decrease signal-sensing costs significantly; moreover, it has been demonstrated that both supervised and unsupervised statistical learning algorithms work reliably within randomly projected subspaces. Capitalizing on this latter development, several class-dependent strategies are proposed for the reconstruction of hyperspectral imagery from random projections. In this approach, each hyperspectral pixel is first classified into one of several pixel groups using either a conventional supervised classifier or an unsupervised clustering algorithm. After the grouping procedure, a suitable reconstruction method, such as compressive projection principal component analysis, is employed independently within each group. Experimental results confirm that such class-dependent reconstruction, which employs statistics pertinent to each class as opposed to the global statistics estimated over the entire data set, results in more accurate reconstructions of hyperspectral pixels from random projections.
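The class-dependent strategy described in the abstract (group pixels first, then reconstruct each group from its random projections using that group's own statistics) can be illustrated with a minimal NumPy sketch. This is an assumption-laden toy, not the paper's CPPCA algorithm: the data are synthetic, the class labels come from held-out training sets rather than a classifier or clustering step, and reconstruction is a simple least-squares fit in a PCA subspace.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 50, 10  # spectral bands, random-projection measurements (M << N)

def make_class(mean_freq, n):
    """Synthetic 'hyperspectral' pixels: class mean spectrum + low-rank variation."""
    t = np.linspace(0, 1, N)
    mean = np.sin(2 * np.pi * mean_freq * t)
    basis = rng.normal(size=(N, 3))              # 3 intra-class variation modes
    coeff = 0.2 * rng.normal(size=(n, 3))
    return mean + coeff @ basis.T + 0.01 * rng.normal(size=(n, N))

train = [make_class(f, 200) for f in (1.0, 3.0)]  # two hypothetical classes
test  = [make_class(f, 50)  for f in (1.0, 3.0)]

Phi = rng.normal(size=(M, N)) / np.sqrt(M)        # random projection operator

def pca_basis(X, r=4):
    """Top-r principal directions of X (rows = pixels)."""
    _, _, Vt = np.linalg.svd(X - X.mean(0), full_matrices=False)
    return Vt[:r].T                               # N x r

def reconstruct(y, U, mu, Phi):
    """Model x ~ mu + U a; solve (Phi U) a = y - Phi mu in least squares."""
    a, *_ = np.linalg.lstsq(Phi @ U, y - Phi @ mu, rcond=None)
    return mu + U @ a

# Global statistics (one basis for all pixels) vs. class-dependent statistics
X_all = np.vstack(train)
U_g, mu_g = pca_basis(X_all), X_all.mean(0)

err_global, err_class = [], []
for k, X_test in enumerate(test):
    U_k, mu_k = pca_basis(train[k]), train[k].mean(0)
    for x in X_test:
        y = Phi @ x                               # random projection of the pixel
        err_global.append(np.mean((reconstruct(y, U_g, mu_g, Phi) - x) ** 2))
        err_class.append(np.mean((reconstruct(y, U_k, mu_k, Phi) - x) ** 2))

print("global-stats MSE:", np.mean(err_global))
print("class-stats  MSE:", np.mean(err_class))
```

On this toy data the class-dependent reconstruction error is markedly lower, mirroring the abstract's claim: a single global PCA basis must spread its few components across both classes, while each per-class basis captures its own mean spectrum and variation modes.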
Pages: 833-843
Number of pages: 11