A Kernel Perspective for the Decision Boundary of Deep Neural Networks

Cited by: 1
Authors
Zhang, Yifan [1 ]
Liao, Shizhong [1 ]
Affiliations
[1] Tianjin Univ, Coll Intelligence & Comp, Tianjin 300350, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
deep neural network; kernel method; generalization ability; gradient descent; decision boundary;
DOI
10.1109/ICTAI50040.2020.00105
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Deep learning has achieved great success in many fields, but it still lacks a thorough theoretical understanding. Although recent theoretical and experimental results have investigated the representation power of deep learning, little effort has been devoted to analyzing its generalization ability. In this paper, we analyze deep neural networks from a kernel perspective and use kernel methods to investigate how the implicit regularization introduced by gradient descent affects generalization. Firstly, we argue that the multi-layer nonlinear feature transformation in deep neural networks is equivalent to a kernel feature mapping, and we support this claim from two angles: the distinctive mathematical properties of kernel methods and the construction of multi-layer kernel machines. Secondly, using the representer theorem, we analyze the decision boundary of deep neural networks and prove that, at the last hidden layer, deep neural networks converge to nonlinear SVMs. Systematic experiments demonstrate that the decision boundaries of neural networks converge to those of nonlinear SVMs.
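The abstract's central claim, that the trained network's last hidden layer induces a feature map on which the final classifier behaves like a nonlinear SVM, can be illustrated with a small self-contained experiment. The sketch below is not the paper's experimental setup: the toy dataset (make_moons), the network size, and all hyperparameters are illustrative assumptions. It trains a small ReLU network with SGD, fits a linear SVM on the network's last-hidden-layer features (which is a nonlinear, network-induced-kernel SVM in input space), and measures how often the two decision rules agree on a dense grid of the input plane.

# Minimal sketch (assumed setup, not the paper's experiments): compare a
# trained MLP's decision boundary with that of a linear SVM fit on the
# MLP's last-hidden-layer features.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import LinearSVC

X, y = make_moons(n_samples=2000, noise=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Train a small ReLU network with (mini-batch) SGD.
net = MLPClassifier(hidden_layer_sizes=(64, 64), activation="relu",
                    solver="sgd", learning_rate_init=0.01,
                    max_iter=3000, random_state=0)
net.fit(X_tr, y_tr)

def last_hidden_features(model, X):
    """Forward pass up to and including the last hidden layer."""
    h = X
    for W, b in zip(model.coefs_[:-1], model.intercepts_[:-1]):
        h = np.maximum(h @ W + b, 0.0)  # ReLU activation
    return h

# A linear SVM on the learned feature map: nonlinear SVM in input space
# with the network-induced kernel.
svm = LinearSVC(C=1.0, max_iter=10000)
svm.fit(last_hidden_features(net, X_tr), y_tr)

# Compare the two decision rules on a dense grid of the input plane.
xx, yy = np.meshgrid(np.linspace(-2.0, 3.0, 300), np.linspace(-1.5, 2.0, 300))
grid = np.c_[xx.ravel(), yy.ravel()]
net_labels = net.predict(grid)
svm_labels = svm.predict(last_hidden_features(net, grid))
print("test accuracy (net): %.3f" % net.score(X_te, y_te))
print("boundary agreement on grid: %.3f" % np.mean(net_labels == svm_labels))

If the convergence claim holds for this toy setting, the agreement fraction on the grid should approach 1 once the network has been trained to convergence, with disagreements confined to a thin band around the shared boundary.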
Pages: 653-660
Page count: 8