Dimensionality reduction based on ICA for regression problems

Cited: 0
Authors
Kwak, Nojun
Kim, Chunghoon
Affiliations
[1] Samsung Elect, Suwon 442742, Gyeonggi, South Korea
[2] Seoul Natl Univ, Sch Elect Engn & Comp Sci, Seoul 151744, South Korea
Source
ARTIFICIAL NEURAL NETWORKS - ICANN 2006, PT 1 | 2006 / Vol. 4131
Keywords
DOI
None available
Chinese Library Classification
TP18 [Theory of artificial intelligence];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In supervised learning, we often extract new features from the original ones to reduce the dimensionality of the feature space and achieve better performance. In this paper, we show how standard algorithms for independent component analysis (ICA) can be applied to extract features for regression problems. The advantage is that general ICA algorithms become applicable to feature extraction for regression by maximizing the joint mutual information between the target variable and the new features. Using the new features, we can greatly reduce the dimensionality of the feature space without degrading regression performance.
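The abstract's idea can be approximated in a minimal sketch. Note this is not the authors' algorithm (which maximizes the joint mutual information directly): here plain unsupervised FastICA extracts components, which are then ranked by their estimated mutual information with the target and the top few are kept for regression. All names and parameter choices below are illustrative assumptions.

```python
# Hedged sketch: ICA-based dimensionality reduction for regression.
# Approximation only -- the paper optimizes joint MI with the target,
# whereas this ranks unsupervised ICA components by per-component MI.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.decomposition import FastICA
from sklearn.feature_selection import mutual_info_regression
from sklearn.linear_model import LinearRegression

# Synthetic regression data: 20 features, only 5 of them informative.
X, y = make_regression(n_samples=300, n_features=20, n_informative=5,
                       random_state=0)

# Extract independent components from the input features.
ica = FastICA(n_components=20, whiten="unit-variance", random_state=0)
S = ica.fit_transform(X)

# Rank components by mutual information with the target; keep the top 5.
mi = mutual_info_regression(S, y, random_state=0)
top = np.argsort(mi)[::-1][:5]
S_reduced = S[:, top]

# Fit a regressor in the reduced 5-dimensional space.
model = LinearRegression().fit(S_reduced, y)
print(S_reduced.shape)
```

On data like this, most of the predictive signal concentrates in a few components, so the regression fit in the 5-dimensional space stays close to the full-dimensional fit, which is the effect the abstract describes.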
Pages: 1-10
Page count: 10