Extreme Learning Regression for nu Regularization

Cited by: 2
Authors
Ding, Xiao-Jian [1 ]
Yang, Fan [1 ]
Liu, Jian [1 ]
Cao, Jie [1 ]
Institution
[1] Nanjing Univ Finance & Econ, Coll Informat Engn, Nanjing 210007, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
MACHINE; NETWORKS;
DOI
10.1080/08839514.2020.1723863
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Extreme learning machine for regression (ELR), though efficient, is not preferred in time-limited applications because its model selection time is large. To overcome this problem, we reformulate ELR to take a new regularization parameter nu (nu-ELR), inspired by Schölkopf et al. The regularization parameter nu is bounded between 0 and 1 and is easier to interpret than C. In this paper, we propose using the active set algorithm to solve the quadratic programming optimization problem of nu-ELR. Experimental results on real regression problems show that nu-ELR performs better than ELM, ELR, and nu-SVR, and is computationally efficient compared to other iterative learning models. Additionally, the model selection time of nu-ELR can be significantly shortened.
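As a rough, self-contained sketch of the regularized-ELM baseline (ELR) that the abstract starts from, the NumPy snippet below trains a C-regularized random-hidden-layer regressor. The function names elr_fit/elr_predict, the tanh activation, and the hidden-layer size are illustrative assumptions; the paper's actual contribution, the nu-parameterized dual quadratic program solved with an active set method, is not reproduced here.

```python
# Minimal sketch of a C-regularized ELM regressor (ELR baseline), NOT the
# paper's nu-ELR: per the abstract, nu-ELR replaces C with a parameter
# nu in (0, 1) and solves the resulting dual QP with an active set method.
import numpy as np

def elr_fit(X, y, n_hidden=100, C=1.0, seed=None):
    """Fit output weights beta = (H^T H + I/C)^{-1} H^T y on a random hidden layer."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights (fixed)
    b = rng.standard_normal(n_hidden)                # random biases (fixed)
    H = np.tanh(X @ W + b)                           # hidden-layer output matrix
    beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ y)
    return W, b, beta

def elr_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# Toy usage: fit a noisy sine curve and report training MSE.
X = np.linspace(0, 2 * np.pi, 200)[:, None]
y = np.sin(X).ravel() + 0.1 * np.random.default_rng(0).standard_normal(200)
model = elr_fit(X, y, n_hidden=50, C=10.0, seed=0)
print(np.mean((elr_predict(model, X) - y) ** 2))
```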
Pages: 378-395
Page count: 18
Related Papers
50 items in total
  • [21] Predicting total sediment load transport in rivers using regression techniques, extreme learning and deep learning models
    Shakya, Deepti
    Deshpande, Vishal
    Kumar, Bimlesh
    Agarwal, Mayank
    ARTIFICIAL INTELLIGENCE REVIEW, 2023, 56 (09) : 10067 - 10098
  • [22] An Incremental Dual nu-Support Vector Regression Algorithm
    Yu, Hang
    Lu, Jie
    Zhang, Guangquan
    ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PAKDD 2018, PT II, 2018, 10938 : 520 - 531
  • [23] 1-Norm extreme learning machine for regression and multiclass classification using Newton method
    Balasundaram, S.
    Gupta, Deepak
    Kapil
    NEUROCOMPUTING, 2014, 128 : 4 - 14
  • [24] K-Means clustering based Extreme Learning ANFIS with improved interpretability for regression problems
    Pramod, C. P.
    Pillai, G. N.
    KNOWLEDGE-BASED SYSTEMS, 2021, 215
  • [25] Parameter-insensitive kernel in extreme learning for non-linear support vector regression
    Frenay, Benoit
    Verleysen, Michel
    NEUROCOMPUTING, 2011, 74 (16) : 2526 - 2531
  • [26] Robust regression with extreme support vectors
    Zhu, Wentao
    Miao, Jun
    Qing, Laiyun
    PATTERN RECOGNITION LETTERS, 2014, 45 : 205 - 210
  • [27] Approximate Bayesian MLP regularization for regression in the presence of noise
    Park, Jung-Guk
    Jo, Sungho
    NEURAL NETWORKS, 2016, 83 : 75 - 85
  • [28] Decentralized Online Linear Regression With the Regularization Parameter and Noises
    Zhang, Xiwei
    Li, Tao
    Fu, Xiaozheng
    2022 17TH INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION, ROBOTICS AND VISION (ICARCV), 2022, : 84 - 89
  • [29] Causal Learning via Manifold Regularization
    Hill, Steven M.
    Oates, Chris J.
    Blythe, Duncan A.
    Mukherjee, Sach
    JOURNAL OF MACHINE LEARNING RESEARCH, 2019, 20
  • [30] Framelet Kernels With Applications to Support Vector Regression and Regularization Networks
    Zhang, Wei-Feng
    Dai, Dao-Qing
    Yan, Hong
    IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS, 2010, 40 (04): : 1128 - 1144