System Identification: A Machine Learning Perspective

Times cited: 0
Authors
Chiuso, A. [1 ]
Pillonetto, G. [1 ]
Affiliations
[1] Univ Padua, Dept Informat Engn, I-35131 Padua, Italy
Source
ANNUAL REVIEW OF CONTROL, ROBOTICS, AND AUTONOMOUS SYSTEMS, 2019, Vol. 2
Keywords
system identification; machine learning; kernel methods; Gaussian processes; KERNEL-BASED APPROACH; SUPPORT VECTOR MACHINES; MODEL SELECTION; STOCHASTIC-PROCESSES; MARGINAL LIKELIHOOD; BAYESIAN-ESTIMATION; VARIABLE SELECTION; DYNAMICAL-SYSTEMS; RANK MINIMIZATION; SPARSE ESTIMATION;
DOI
10.1146/annurev-control-053018-023744
Chinese Library Classification (CLC)
TP [automation technology, computer technology]
Discipline code
0812
Abstract
Estimation of functions from sparse and noisy data is a central theme in machine learning. In the last few years, many algorithms have been developed that exploit Tikhonov regularization theory and reproducing kernel Hilbert spaces. These are the so-called kernel-based methods, which include powerful approaches like regularization networks, support vector machines, and Gaussian regression. Recently, these techniques have also gained popularity in the system identification community. In both linear and nonlinear settings, kernels that incorporate information on dynamic systems, such as the smoothness and stability of the input-output map, can challenge consolidated approaches based on parametric model structures. In the classical parametric setting, the complexity of the model (the model order) needs to be chosen, typically from a finite family of alternatives, by trading bias and variance. This (discrete) model order selection step may be critical, especially when the true model does not belong to the model class. In regularization-based approaches, model complexity is controlled by tuning (continuous) regularization parameters, making the model selection step more robust. In this article, we review these new kernel-based system identification approaches and discuss extensions based on nuclear and ℓ1 norms.
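As a rough illustration of the kernel-based approach the abstract describes (not code from the article itself), the sketch below estimates a linear system's impulse response with a stability-encoding kernel, regularizing a continuous hyperparameter instead of selecting a discrete model order. The test system, data sizes, and the hyperparameter values `alpha`, `c`, and `sigma2` are invented for the example; the marginal-likelihood tuning of these hyperparameters, which the article discusses, is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
n, N = 50, 200  # FIR length and number of output samples (illustrative)

# A decaying "true" impulse response and white-noise input data.
k = np.arange(n)
g_true = 0.8**k * np.sin(0.5 * k)
u = rng.standard_normal(N + n)

# Toeplitz regressor matrix: row t gives y[t] = sum_k g[k] * u[t - k].
U = np.column_stack([u[n - 1 - j : n - 1 - j + N] for j in range(n)])
y = U @ g_true + 0.1 * rng.standard_normal(N)

# Stability/smoothness-encoding kernel of the kind used in this literature:
# K[i, j] = c * alpha**max(i, j), so prior variance decays along the tail.
alpha, c, sigma2 = 0.8, 1.0, 0.01  # assumed values, not tuned here
K = c * alpha ** np.maximum.outer(k, k)

# Regularized (Gaussian-process / kernel ridge) estimate:
# g_hat = K U^T (U K U^T + sigma2 I)^{-1} y
g_hat = K @ U.T @ np.linalg.solve(U @ K @ U.T + sigma2 * np.eye(N), y)
```

Here the model "order" is never chosen from a discrete family; the kernel fixes a long FIR parameterization, and the continuous hyperparameters control effective complexity, which is the robustness point the abstract makes.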
Pages: 281-304
Number of pages: 24