A joint optimization framework to semi-supervised RVFL and ELM networks for efficient data classification

Cited by: 33
Authors
Peng, Yong [1 ,2 ]
Li, Qingxi [1 ]
Kong, Wanzeng [1 ]
Qin, Feiwei [1 ]
Zhang, Jianhai [1 ]
Cichocki, Andrzej [3 ]
Affiliations
[1] Hangzhou Dianzi Univ, Sch Comp Sci & Technol, Hangzhou 310018, Peoples R China
[2] Anhui Polytech Univ, Key Lab Adv Percept & Intelligent Control High En, Minist Educ, Wuhu 241000, Peoples R China
[3] Skolkov Inst Sci & Technol, Ctr Computat & Data Intens Sci & Engn, Moscow 143026, Russia
Funding
China Postdoctoral Science Foundation;
Keywords
Random vector functional link (RVFL); Extreme learning machine (ELM); Semi-supervised learning; Joint optimization; Electroencephalography (EEG); Emotion recognition; EXTREME LEARNING-MACHINE; DIFFERENTIAL ENTROPY FEATURE; RANDOMIZED ALGORITHMS; NEURAL-NETWORKS; MANIFOLD; CLASSIFIERS;
DOI
10.1016/j.asoc.2020.106756
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Because gradient-based iterative training is inefficient, randomization-based neural networks usually offer non-iterative, closed-form solutions. The random vector functional link (RVFL) network and the extreme learning machine (ELM) are two popular randomized networks that provide unified frameworks for both regression and multi-class classification. Existing studies on RVFL and ELM have focused mainly on supervised tasks, even though in practice we usually have only a small number of labeled samples but a large number of unlabeled ones. It is therefore necessary to enable both models to exploit labeled and unlabeled samples together; that is, to develop their semi-supervised extensions. In this paper, we propose a joint optimization framework for semi-supervised RVFL and ELM networks. In the resulting JOSRVFL (jointly optimized semi-supervised RVFL) and JOSELM models, the output weight matrix and the label indicator matrix of the unlabeled samples are jointly optimized in an iterative manner. We provide a novel approach to optimizing the JOSRVFL and JOSELM objective functions. Extensive experiments on benchmark data sets and electroencephalography-based emotion recognition tasks demonstrate the excellent performance of the proposed JOSRVFL and JOSELM models. Moreover, because the direct input-output connections help to regularize the randomization, JOSRVFL obtains superior performance to JOSELM in most cases. (C) 2020 Elsevier B.V. All rights reserved.
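To illustrate the closed-form training that the abstract refers to, here is a minimal sketch of a standard supervised ELM and the RVFL variant with direct input-output links, written with NumPy. This is generic background, not the authors' JOSRVFL/JOSELM algorithm (which additionally optimizes a label indicator matrix for unlabeled samples); all variable names, the toy data, and the ridge parameter `C` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 100 samples, 5 features, 3 classes (one-hot targets).
X = rng.normal(size=(100, 5))
y = rng.integers(0, 3, size=100)
T = np.eye(3)[y]

# ELM: random input weights and biases are fixed, never trained;
# only the output weights beta are found by a regularized
# least-squares solve (no gradient iterations).
L = 50                                   # number of hidden nodes
W = rng.normal(size=(5, L))              # random input weights
b = rng.normal(size=(1, L))              # random hidden biases
H = np.tanh(X @ W + b)                   # hidden-layer output matrix

C = 1.0                                  # ridge regularization strength
beta = np.linalg.solve(H.T @ H + np.eye(L) / C, H.T @ T)
pred = np.argmax(H @ beta, axis=1)       # class predictions

# RVFL differs by adding direct input-output connections:
# the output layer sees both the hidden features and the raw inputs.
D = np.hstack([H, X])                    # shape (100, L + 5)
beta_rvfl = np.linalg.solve(D.T @ D + np.eye(L + 5) / C, D.T @ T)
```

The direct links in the RVFL solve are what the abstract credits with regularizing the randomization, since the raw input features pass to the output untouched by the random projection.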
Pages: 15