Extreme Learning Machine based on Rectified Nonlinear Units

Times Cited: 0
Authors
Peng, Jingtao [1 ]
Chen, Liang [1 ]
Ather, Iqbal Muhammad [1 ]
Yu, Ao [1 ]
Affiliations
[1] Donghua Univ, Coll Informat Sci & Technol, Shanghai 201620, Peoples R China
Source
PROCEEDINGS OF THE 2016 2ND WORKSHOP ON ADVANCED RESEARCH AND TECHNOLOGY IN INDUSTRY APPLICATIONS | 2016 / Vol. 81
Keywords
Extreme Learning Machine; Over-saturation; Rectified Linear Units; Rectified Non-Linear Units
DOI
Not available
Chinese Library Classification
T [Industrial Technology];
Discipline Classification Code
08;
Abstract
Traditional Extreme Learning Machine (ELM) networks generally use S-shaped activation functions, such as the Sigmoid and hyperbolic tangent, which suffer from slow convergence and over-saturation. To address these problems and improve the performance of the ELM algorithm, Rectified Non-Linear Units (ReNLUs), which combine rectified linear units (ReLUs) with the Softplus function, are proposed: ReLUs provide sparse representations, while Softplus is smooth and non-saturating. Experimental results show that an ELM using the ReNLU activation function achieves significantly higher training and testing accuracy while reducing training and testing time.
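The abstract describes plugging rectified activations into an ELM, whose standard training procedure (Huang et al., 2004) fixes random hidden-layer weights and solves the output weights by least squares. The exact ReNLU formulation is not given in this record, so the sketch below only shows standard ReLU and Softplus activations inside a minimal ELM; the function names `elm_train` and `elm_predict` are illustrative, not from the paper.

```python
import numpy as np

def relu(x):
    """Rectified linear unit: max(0, x), yielding sparse activations."""
    return np.maximum(0.0, x)

def softplus(x):
    """Smooth, non-saturating log(1 + exp(x)), computed stably."""
    return np.logaddexp(0.0, x)

def elm_train(X, T, n_hidden, activation, seed=0):
    """Basic ELM training: random hidden layer, least-squares output weights.

    X: (n_samples, n_features) inputs; T: (n_samples, n_outputs) targets.
    Returns the random hidden weights/biases and the solved output weights.
    """
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # fixed random input weights
    b = rng.standard_normal(n_hidden)                # fixed random biases
    H = activation(X @ W + b)                        # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T                     # Moore-Penrose least squares
    return W, b, beta

def elm_predict(X, W, b, beta, activation):
    return activation(X @ W + b) @ beta
```

Swapping `activation` between `relu`, `softplus`, or a sigmoid is how the paper's comparison could be reproduced in this setup; only the hidden-layer nonlinearity changes, while the closed-form output solve stays identical.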
Pages: 1525-1530
Page count: 6
References
Total: 9
  • [1] Dahl GE, 2013, INT CONF ACOUST SPEE, P8609, DOI 10.1109/ICASSP.2013.6639346
  • [2] GE S, 2010, ADAPTIVE NEURAL NETW
  • [3] Glorot X., 2011, P 14 INT C ARTIFICIA, P315
  • [4] Glorot X, 2010, P 13 INT C ART INT S, P249
  • [5] An extreme learning machine algorithm based on mutual information variable selection
    Han, Min
    Liu, Xiaoxin
    [J]. Control and Decision, 2014, 29 (09) : 1576 - 1580
  • [6] Huang GB, 2004, IEEE IJCNN, P985
  • [7] ImageNet Classification with Deep Convolutional Neural Networks
    Krizhevsky, Alex
    Sutskever, Ilya
    Hinton, Geoffrey E.
    [J]. COMMUNICATIONS OF THE ACM, 2017, 60 (06) : 84 - 90
  • [8] Advances in extreme learning machines (ELM2012)
    Lendasse, Amaury
    He, Qing
    Miche, Yoan
    Huang, Guang-Bin
    [J]. NEUROCOMPUTING, 2014, 128 : 1 - 3
  • [9] Nair V., 2010, P 27 INT C MACH LEAR, P807