A Difference of Convex Functions Algorithm for Switched Linear Regression

Cited by: 23
Authors
Tao Pham Dinh [1 ]
Hoai Minh Le [2 ]
Hoai An Le Thi [2 ]
Lauer, Fabien [3 ]
Affiliations
[1] Normandie Univ, Natl Inst Appl Sci Rouen, Math Lab, F-76801 St Etienne Du Rouvray, France
[2] Univ Lorraine, Lab Theoret & Appl Comp Sci LITA EA 3097, F-57045 Metz, France
[3] Univ Lorraine, CNRS, LORIA, Inria, F-54506 Vandoeuvre Les Nancy, France
Keywords
DC programming; DCA; nonconvex optimization; nonsmooth optimization; piecewise affine systems; switched linear systems; switched regression; system identification; IDENTIFICATION; OPTIMIZATION;
DOI
10.1109/TAC.2014.2301575
CLC Number
TP [Automation technology; computer technology];
Discipline Code
0812;
Abstract
This technical note deals with switched linear system identification and, more specifically, with solving switched linear regression problems in a large-scale setting with both many data points and many parameters to learn. We consider the recent minimum-of-error framework with a quadratic loss function, in which an objective function based on a sum of minimum errors with respect to multiple submodels is to be minimized. The technical note proposes a new approach to the optimization of this nonsmooth and nonconvex objective function, which relies on Difference of Convex (DC) functions programming. In particular, we formulate a proper DC decomposition of the objective function, which allows us to derive a computationally efficient DC algorithm. Numerical experiments show that the method can efficiently and accurately learn switching models in large dimensions and from many data points.
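For readers who want the optimization problem in concrete terms, the following lines give a minimal LaTeX sketch of the minimum-of-error objective with quadratic loss and one natural DC decomposition of it. The notation (N data pairs (x_i, y_i), s linear submodels with parameter vectors theta_1, ..., theta_s, and per-point errors e_ij) is assumed here for illustration, and the exact decomposition derived in the technical note may differ from the one shown.

% Minimum-of-error objective with quadratic loss (notation assumed for illustration):
\min_{\theta_1,\dots,\theta_s} \; J(\theta) \;=\; \sum_{i=1}^{N} \min_{1 \le j \le s} e_{ij}(\theta),
\qquad e_{ij}(\theta) = \bigl( y_i - x_i^{\top}\theta_j \bigr)^2 .

% One natural DC decomposition J = g - h, obtained from the identity
%   \min_j e_{ij} = \sum_j e_{ij} - \max_j \sum_{k \ne j} e_{ik} :
g(\theta) = \sum_{i=1}^{N} \sum_{j=1}^{s} e_{ij}(\theta),
\qquad
h(\theta) = \sum_{i=1}^{N} \max_{1 \le j \le s} \sum_{k \ne j} e_{ik}(\theta).

Both g and h are built from sums and pointwise maxima of convex quadratics and are therefore convex, so J = g - h is a DC function and a DCA scheme (alternating between a subgradient step on h and a convex subproblem in the theta_j) can in principle be applied.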
Pages: 2277-2282
Page count: 6