Distance regression by Gauss-Newton-type methods and iteratively re-weighted least-squares

Cited by: 4
Authors
Aigner, Martin [1]
Juettler, Bert [1]
Affiliation
[1] Johannes Kepler Univ Linz, Inst Appl Geometry, A-4040 Linz, Austria
Keywords
Curve and surface fitting; Iteratively re-weighted least squares; Gauss-Newton method; Fitting by evolution; SPLINE CURVES; SURFACE; APPROXIMATION; MINIMIZATION;
DOI
10.1007/s00607-009-0055-6
CLC number
TP301 [Theory and methods];
Discipline code
081202;
Abstract
We discuss the problem of fitting a curve or surface to given measurement data. In many situations, the usual least-squares approach (minimization of the sum of squared norms of residual vectors) is not suitable, as it implicitly assumes a Gaussian distribution of the measurement errors. In those cases, it is more appropriate to minimize other functions (which we will call norm-like functions) of the residual vectors. This is well understood in the case of scalar residuals, where the technique of iteratively re-weighted least-squares, which originated in statistics (Huber in Robust statistics, 1981), is known to be a Gauss-Newton-type method for minimizing a sum of norm-like functions of the residuals. We extend this result to the case of vector-valued residuals. It is shown that simply treating the norms of the vector-valued residuals as scalar ones does not work. In order to illustrate the difference we provide a geometric interpretation of the iterative minimization procedures as evolution processes.
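The scalar-residual case that the abstract refers to can be sketched as a short iteratively re-weighted least-squares loop. The example below is a generic illustration with Huber weights for a robust straight-line fit, not the authors' vector-residual method; the function name, data, and the Huber threshold `delta` are illustrative choices.

```python
import numpy as np

def irls_line_fit(x, y, delta=1.0, iters=50):
    """Fit y ~ a*x + b by minimizing the Huber function of the scalar
    residuals via iteratively re-weighted least squares."""
    A = np.column_stack([x, np.ones_like(x)])
    coef = np.linalg.lstsq(A, y, rcond=None)[0]  # ordinary LS start
    for _ in range(iters):
        r = y - A @ coef
        # Huber weight w(r) = psi(r)/r = min(1, delta/|r|):
        # quadratic treatment of small residuals, linear of large ones
        w = np.minimum(1.0, delta / np.maximum(np.abs(r), 1e-12))
        s = np.sqrt(w)
        # each iteration solves an ordinary weighted least-squares problem
        coef = np.linalg.lstsq(A * s[:, None], y * s, rcond=None)[0]
    return coef

# four clean points on y = 2x + 1 plus one gross outlier
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0
y[2] += 20.0
a, b = irls_line_fit(x, y)
# the outlier is strongly down-weighted, so the fit stays close to (2, 1),
# whereas plain least squares would be pulled far off
```

Each pass re-solves a weighted least-squares problem whose weights come from the previous residuals; this is exactly the sense in which the paper interprets IRLS as a Gauss-Newton-type method.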
Pages: 73-87
Number of pages: 15
Related papers
50 records in total
  • [1] Distance regression by Gauss-Newton-type methods and iteratively re-weighted least-squares
    Martin Aigner; Bert Jüttler
    Computing, 2009, 86: 73-87
  • [2] Robust regression using iteratively re-weighted least-squares
    Holland, P. W.; Welsch, R. E.
    Communications in Statistics, Part A: Theory and Methods, 1977, 6(9): 813-827
  • [3] Conjugate gradient acceleration of iteratively re-weighted least squares methods
    Fornasier, Massimo; Peter, Steffen; Rauhut, Holger; Worm, Stephan
    Computational Optimization and Applications, 2016, 65(1): 205-259
  • [4] Efficient privacy-preserving logistic regression with iteratively re-weighted least squares
    Kikuchi, Hiroaki; Yasunaga, Hideo; Matsui, Hiroki; Fan, Chun-I
    2016 11th Asia Joint Conference on Information Security (AsiaJCIS), 2016: 48-54
  • [5] Convergence and stability of iteratively re-weighted least squares algorithms
    Ba, Demba; Babadi, Behtash; Purdon, Patrick L.; Brown, Emery N.
    IEEE Transactions on Signal Processing, 2014, 62(1): 183-195
  • [6] Robust data whitening as an iteratively re-weighted least squares problem
    Mukundan, Arun; Tolias, Giorgos; Chum, Ondrej
    Image Analysis, SCIA 2017, Part I, 2017, 10269: 234-247
  • [7] Convergence of iteratively re-weighted least squares to robust M-estimators
    Aftab, Khurrum; Hartley, Richard
    2015 IEEE Winter Conference on Applications of Computer Vision (WACV), 2015: 480-487
  • [8] Solving robust regularization problems using iteratively re-weighted least squares
    Kiani, Khurrum Aftab; Drummond, Tom
    2017 IEEE Winter Conference on Applications of Computer Vision (WACV 2017), 2017: 483-492