A smoothed monotonic regression via L2 regularization

Cited by: 17
Authors
Sysoev, Oleg [1 ]
Burdakov, Oleg [2 ]
Affiliations
[1] Linkoping Univ, Dept Comp & Informat Sci, Div Stat & Machine Learning, S-58183 Linkoping, Sweden
[2] Linkoping Univ, Dept Math, S-58183 Linkoping, Sweden
Keywords
Monotonic regression; Kernel smoothing; Penalized regression; Probabilistic learning; P-splines; Algorithm; Intervals
DOI
10.1007/s10115-018-1201-2
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Monotonic regression is a standard method for extracting a monotone function from non-monotonic data, and it is used in many applications. However, a known drawback of this method is that its fitted response is a piecewise constant function, while practical response functions are often required to be continuous. The method proposed in this paper achieves both monotonicity and smoothness of the regression by introducing an L2 regularization term. To achieve low computational complexity while providing high predictive power, we introduce a probabilistically motivated approach for selecting the regularization parameters. In addition, we present a technique for correcting inconsistencies on the boundary. We show that the complexity of the proposed method is O(n^2). Our simulations demonstrate that when the data are large and the expected response is a complicated function (which is typical in machine learning applications), or when there is a change point in the response, the proposed method has higher predictive power than many existing methods.
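To make the idea in the abstract concrete, the following minimal Python sketch fits a nondecreasing vector to data while penalizing the squared differences between neighbouring fitted values, i.e. it minimizes ||y - f||^2 + lambda * sum_i (f_{i+1} - f_i)^2 subject to f_{i+1} >= f_i. This is an illustrative stand-in rather than the authors' SMR method: the function name smoothed_monotone_fit, the fixed lambda, and the generic SLSQP solver are assumptions for this example, whereas the paper proposes a dedicated O(n^2) algorithm with a probabilistic choice of the regularization parameters and a boundary correction.

import numpy as np
from scipy.optimize import minimize

def smoothed_monotone_fit(y, lam=1.0):
    """Fit a nondecreasing vector f to y with an L2 penalty on successive differences."""
    y = np.asarray(y, dtype=float)
    n = y.size

    def objective(f):
        fit = np.sum((y - f) ** 2)        # data-fidelity term
        smooth = np.sum(np.diff(f) ** 2)  # L2 smoothness penalty on neighbouring values
        return fit + lam * smooth

    # Monotonicity constraints: f[i+1] - f[i] >= 0 for every consecutive pair.
    cons = [{"type": "ineq", "fun": lambda f, i=i: f[i + 1] - f[i]}
            for i in range(n - 1)]

    # Start from the sorted data, which already satisfies the constraints.
    res = minimize(objective, x0=np.sort(y), constraints=cons, method="SLSQP")
    return res.x

# Usage: recover a smooth, nondecreasing curve from noisy observations.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = x ** 2 + 0.1 * rng.standard_normal(x.size)
f_hat = smoothed_monotone_fit(y, lam=5.0)
print(np.all(np.diff(f_hat) >= -1e-8))  # monotone up to solver tolerance

Unlike plain isotonic regression (e.g. the pool-adjacent-violators fit), the penalty term discourages the piecewise-constant plateaus mentioned in the abstract, at the cost of choosing the regularization weight.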
Pages: 197-218
Number of pages: 22