From Kernel Methods to Neural Networks: A Unifying Variational Formulation

Cited: 1
Authors
Unser, Michael [1 ]
Affiliations
[1] École Polytechnique Fédérale de Lausanne (EPFL), Biomedical Imaging Group, Station 17, CH-1015 Lausanne, Switzerland
Keywords
Machine learning; Convex optimization; Regularization; Representer theorem; Kernel methods; Neural networks; Banach space; Approximation; Splines; Interpolation; Regression; Transform
DOI
10.1007/s10208-023-09624-9
CLC Number
TP301 [Theory and Methods]
Subject Classification Code
081202
Abstract
The minimization of a data-fidelity term and an additive regularization functional gives rise to a powerful framework for supervised learning. In this paper, we present a unifying regularization functional that depends on an operator $\textrm{L}$ and on a generic Radon-domain norm. We establish the existence of a minimizer and give the parametric form of the solution(s) under very mild assumptions. When the norm is Hilbertian, the proposed formulation yields a solution that involves radial-basis functions and is compatible with the classical methods of machine learning. By contrast, for the total-variation norm, the solution takes the form of a two-layer neural network with an activation function that is determined by the regularization operator. In particular, we retrieve the popular ReLU networks by letting the operator be the Laplacian. We also characterize the solution for the intermediate regularization norms $\Vert \cdot \Vert = \Vert \cdot \Vert_{L_p}$ with $p \in (1,2]$. Our framework offers guarantees of universal approximation for a broad family of regularization operators or, equivalently, for a wide variety of shallow neural networks, including the cases (such as ReLU) where the activation function is increasing polynomially. It also explains the favorable role of bias and skip connections in neural architectures.
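As a rough, schematic illustration of the structure described in the abstract (not the paper's exact statement; the loss $E$, the increasing function $\psi$, the activation $\sigma$, and the polynomial term $q$ are placeholder symbols), the learning problem has the generic data-fidelity-plus-regularization form
$$ f^\star \in \arg\min_{f} \; \sum_{m=1}^{M} E\bigl(y_m, f(\mathbf{x}_m)\bigr) + \psi\bigl(\Vert \textrm{L}\{f\} \Vert\bigr), $$
with the norm evaluated in the Radon domain. For the total-variation choice of that norm, the abstract states that a solution can be written as a two-layer network,
$$ f^\star(\mathbf{x}) = \sum_{k=1}^{K} a_k \, \sigma\bigl(\mathbf{w}_k^{\mathsf{T}} \mathbf{x} - b_k\bigr) + q(\mathbf{x}), $$
where the activation $\sigma$ is determined by $\textrm{L}$ (a ReLU when $\textrm{L}$ is the Laplacian), and the low-degree polynomial term $q$ is where the bias and skip connections enter.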
Pages: 1779-1818
Number of pages: 40
Related papers
50 items in total
  • [41] Blind signal separation methods for integration of neural networks results
    Szupiluk, Ryszard
    Wojewnik, Piotr
    Zabkowski, Tomasz
    2006 9TH INTERNATIONAL CONFERENCE ON INFORMATION FUSION, VOLS 1-4, 2006: 371 - 376
  • [42] Spectral Analysis of the Neural Tangent Kernel for Deep Residual Networks
    Belfer, Yuval
    Geifman, Amnon
    Galun, Meirav
    Basri, Ronen
    JOURNAL OF MACHINE LEARNING RESEARCH, 2024, 25 : 1 - 49
  • [43] Solving complementarity and variational inequalities problems using neural networks
    Yashtini, M.
    Malek, A.
    APPLIED MATHEMATICS AND COMPUTATION, 2007, 190 (01) : 216 - 230
  • [45] Neural networks for a class of bi-level variational inequalities
    Xu, M. H.
    Li, M.
    Yang, C. C.
    JOURNAL OF GLOBAL OPTIMIZATION, 2009, 44 (04) : 535 - 552
  • [46] VarNet: Variational Neural Networks for the Solution of Partial Differential Equations
    Khodayi-mehr, Reza
    Zavlanos, Michael M.
    LEARNING FOR DYNAMICS AND CONTROL, VOL 120, 2020, 120 : 298 - 307
  • [47] Statistical Methods and Artificial Neural Networks
    Mammadov, Mammadagha
    Yazici, Berna
    Yolacan, Senay
    Aslanargun, Atilla
    Yuzer, Ali Fuat
    Agaoglu, Embiya
    JOURNAL OF MODERN APPLIED STATISTICAL METHODS, 2006, 5 (02) : 495 - 512
  • [49] Software engineering methods for neural networks
    Senyard, A.
    Kazmierczak, E.
    Sterling, L.
    ASIA-PACIFIC SOFTWARE ENGINEERING CONFERENCE, PROCEEDINGS, 2003: 468 - 477
  • [50] Methods for Pruning Deep Neural Networks
    Vadera, Sunil
    Ameen, Salem
    IEEE ACCESS, 2022, 10 : 63280 - 63300