A new formation of supervised dimensionality reduction method for moving vehicle classification

Cited by: 2
Authors
Chandrasekar, K. Silpaja [1 ]
Geetha, P. [1 ]
Affiliations
[1] Anna Univ, Dept Comp Sci & Engn, Chennai, Tamil Nadu, India
Keywords
Linear discriminant analysis; Diagonal eigenvalues; Linear matrix expansion; Network linear outlier factor; Fuzzy random forest classifier; PRINCIPAL-COMPONENT ANALYSIS; MAXIMUM CORRENTROPY; ALGORITHMS; FRAMEWORK; IMAGES;
DOI
10.1007/s00521-020-05524-z
CLC classification number
TP18 [Artificial intelligence theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Analyzing a large set of features for classification entails cost and complexity. To reduce this burden, dimensionality reduction is applied to the extracted features as a preprocessing step. However, many dimensionality reduction algorithms fail to handle high-dimensional data, increase information loss, and are sensitive to outliers. This research therefore proposes a new supervised dimensionality reduction method built on an improved formulation of linear discriminant analysis with diagonal eigenvalues (LDA-DE) that preserves information while addressing these classification issues. The proposed framework reduces the dimension of the extracted feature set by computing the scatter matrices from the class labels and the diagonal eigenvalue matrix. The newly developed LDA-DE method also includes steps to eliminate duplicate rows and columns, avoid feature overwriting, and remove outliers. Implemented with a fuzzy random forest classifier, LDA-DE is tested on two datasets, MIO-TCD and BIT-Vehicle, to classify moving vehicles, and its performance is compared with five state-of-the-art dimensionality reduction methods. The experimental confusion-matrix results show that LDA-DE reduces the objects' feature vectors to the greatest extent. Further, LDA-DE achieves the best reduction results, with optimal performance parameter values (lowest mean and standard deviation, highest F-measure and accuracy) and lower data-processing time than the state-of-the-art methods, promising a fast and effective dimensionality reduction for moving vehicle classification.
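The abstract describes reducing feature dimensionality via scatter matrices computed from class labels, as in linear discriminant analysis. The sketch below shows only classical LDA via within-class and between-class scatter matrices; it is not the paper's LDA-DE variant (the diagonal-eigenvalue formulation, duplicate removal, and outlier handling described above are not reproduced here), and the function name and toy data are illustrative assumptions.

```python
import numpy as np

def lda_project(X, y, n_components):
    """Classical LDA sketch: project X onto the directions that maximize
    between-class scatter relative to within-class scatter."""
    classes = np.unique(y)
    mean_total = X.mean(axis=0)
    n_features = X.shape[1]
    Sw = np.zeros((n_features, n_features))  # within-class scatter
    Sb = np.zeros((n_features, n_features))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mean_c = Xc.mean(axis=0)
        Sw += (Xc - mean_c).T @ (Xc - mean_c)
        diff = (mean_c - mean_total).reshape(-1, 1)
        Sb += Xc.shape[0] * diff @ diff.T
    # Solve the generalized eigenproblem Sb w = lambda Sw w
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(eigvals.real)[::-1]     # largest eigenvalues first
    W = eigvecs[:, order[:n_components]].real  # top discriminant directions
    return X @ W

# Usage: reduce synthetic 4-D features of 3 classes to 2-D
rng = np.random.default_rng(0)
X = rng.normal(size=(90, 4)) + np.repeat(np.eye(3, 4) * 3.0, 30, axis=0)
y = np.repeat([0, 1, 2], 30)
Z = lda_project(X, y, 2)  # shape (90, 2)
```

With C classes, LDA can yield at most C − 1 useful discriminant directions, which is why `n_components=2` is the natural choice for this three-class toy example.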
Pages: 7839-7850
Page count: 12