Growing Self-Organizing Maps for Nonlinear Time-Varying Function Approximation

Cited by: 0
Authors
Paulo H. M. Ferreira
Aluízio F. R. Araújo
Affiliations
[1] Universidade Federal de Pernambuco (UFPE), Centro de Informática
Source
Neural Processing Letters | 2020 / Volume 51
Keywords
Function approximation; Self-organizing map; Incremental learning; Locally weighted learning; Nonlinear time-varying function
DOI
Not available
Abstract
Function approximation may be described as the task of modeling an input-output relation so as to yield an estimate of the true output value. In many domains, an ideal learning algorithm needs to approximate nonlinear time-varying functions over a high-dimensional input space while avoiding problems caused by irrelevant or redundant input data. The method therefore has to meet three requirements: it must allow incremental learning to cope with changing functions and changing input distributions, keep the computational cost low, and achieve accurate estimates. In this paper, we explore different approaches to function approximation based on the Local Adaptive Receptive Fields Self-Organizing Map (LARFSOM). Local models are built from the output associated with the winning node and the difference vector between the input vector and the weight vector, and these local models are combined by a weighted sum to yield the final approximation. The topology is adapted in a self-organizing way, and the weight vectors are adjusted by an unsupervised learning algorithm modified for supervised problems. Experiments were carried out on synthetic and real-world datasets. The results indicate that the proposed approaches perform competitively against Support Vector Regression (SVR) and can improve approximation accuracy and computational cost compared with locally weighted interpolation (LWI), a state-of-the-art interpolation algorithm for self-organizing maps.
Pages: 1689–1714
Page count: 25
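The locally weighted combination described in the abstract can be illustrated with a short sketch. The code below is a minimal illustration under stated assumptions, not the paper's LARFSOM implementation: it assumes each node stores a weight vector, an associated output value, and a local linear coefficient vector, and it uses a Gaussian kernel over the input-to-node distance to supply the mixing weights; the function name, the `sigma` parameter, and the kernel choice are hypothetical.

```python
import numpy as np

def locally_weighted_predict(x, w, y, b, sigma=1.0):
    """Combine per-node local models with a weighted sum (illustrative sketch).

    x : (d,)   input vector
    w : (n, d) node weight vectors
    y : (n,)   output values associated with each node
    b : (n, d) local linear coefficients per node (assumed, not from the paper)
    """
    diff = x - w                                # difference vectors x - w_i, shape (n, d)
    local = y + np.sum(b * diff, axis=1)        # local estimate per node: y_i + b_i . (x - w_i)
    dist2 = np.sum(diff ** 2, axis=1)           # squared distances ||x - w_i||^2
    k = np.exp(-dist2 / (2.0 * sigma ** 2))     # Gaussian kernel mixing weights (assumed kernel)
    return float(np.dot(k, local) / np.sum(k))  # weighted sum of the local models

# Toy usage with random nodes in a 3-D input space
rng = np.random.default_rng(0)
w = rng.normal(size=(20, 3))
b = rng.normal(size=(20, 3))
y = np.sin(w[:, 0])
print(locally_weighted_predict(np.array([0.1, -0.2, 0.3]), w, y, b))
```

Normalizing by the sum of the kernel weights keeps the prediction a convex combination of the local estimates, so nodes far from the query contribute little; the paper's actual receptive-field radii and update rules are not reproduced here.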