From Kernel Methods to Neural Networks: A Unifying Variational Formulation

Cited by: 1
Author
Unser, Michael [1]
Institution
[1] École Polytechnique Fédérale de Lausanne (EPFL), Biomedical Imaging Group, Station 17, CH-1015 Lausanne, Switzerland
Keywords
Machine learning; Convex optimization; Regularization; Representer theorem; Kernel methods; Neural networks; Banach space; Approximation; Splines; Interpolation; Regression; Transform
DOI
10.1007/s10208-023-09624-9
CLC Number
TP301 [Theory and Methods]
Discipline Code
081202
Abstract
The minimization of a data-fidelity term and an additive regularization functional gives rise to a powerful framework for supervised learning. In this paper, we present a unifying regularization functional that depends on an operator $\mathrm{L}$ and on a generic Radon-domain norm. We establish the existence of a minimizer and give the parametric form of the solution(s) under very mild assumptions. When the norm is Hilbertian, the proposed formulation yields a solution that involves radial-basis functions and is compatible with the classical methods of machine learning. By contrast, for the total-variation norm, the solution takes the form of a two-layer neural network with an activation function that is determined by the regularization operator. In particular, we retrieve the popular ReLU networks by letting the operator be the Laplacian. We also characterize the solution for the intermediate regularization norms $\Vert\cdot\Vert = \Vert\cdot\Vert_{L_p}$ with $p \in (1,2]$. Our framework offers guarantees of universal approximation for a broad family of regularization operators or, equivalently, for a wide variety of shallow neural networks, including the cases (such as ReLU) where the activation function increases polynomially. It also explains the favorable role of bias and skip connections in neural architectures.
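For concreteness, the following is a minimal sketch of the generic learning problem and of the total-variation solution form that the abstract announces; the symbols $E$ (data-fidelity term), $\lambda$ (regularization weight), $\rho$ (activation), $a_k$, $\boldsymbol{w}_k$, $b_k$, and $K$ are illustrative placeholders rather than the paper's exact notation. Given training pairs $(\boldsymbol{x}_m, y_m)_{m=1}^{M}$, the problem is of the form
\[
  f^\star \in \arg\min_{f} \; \sum_{m=1}^{M} E\bigl(y_m, f(\boldsymbol{x}_m)\bigr) \;+\; \lambda \,\Vert \mathrm{L} f \Vert ,
\]
and, when the norm is the Radon-domain total-variation norm, the minimizer is (schematically) a two-layer network
\[
  f^\star(\boldsymbol{x}) \;=\; \sum_{k=1}^{K} a_k \,\rho\bigl(\boldsymbol{w}_k^{\mathsf{T}} \boldsymbol{x} - b_k\bigr) \;+\; c_0 + \boldsymbol{c}^{\mathsf{T}} \boldsymbol{x},
\]
where the activation $\rho$ is determined by $\mathrm{L}$ (ReLU when $\mathrm{L}$ is the Laplacian) and the affine part $c_0 + \boldsymbol{c}^{\mathsf{T}} \boldsymbol{x}$ corresponds to the bias and skip connections mentioned at the end of the abstract.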
Pages: 1779–1818 (40 pages)