On the least amount of training data for a machine learning model

Cited by: 1
Authors
Zhao, Dazhi [1 ,2 ]
Hao, Yunquan [1 ]
Li, Weibin [3 ]
Tu, Zhe [4 ]
Affiliations
[1] Southwest Petr Univ, Sch Sci, Chengdu, Peoples R China
[2] Southwest Petr Univ, Inst Artificial Intelligence, Chengdu, Peoples R China
[3] China Aerodynam Res & Dev Ctr, Mianyang 621000, Sichuan, Peoples R China
[4] Zhejiang Wanli Univ, Coll Big Data & Software Engn, Ningbo, Peoples R China
Funding
Natural Science Foundation of Zhejiang Province;
Keywords
Machine learning; sampling theorem; frequency principle; signal recovery; neural network; Gaussian process regression; deep neural networks;
DOI
10.3233/JIFS-211024
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Whether a given amount of training data is sufficient for a specific task is an important question in machine learning, since labeling large amounts of data is expensive while insufficient data leads to underfitting. In this paper, the question of the least amount of training data required by a model is discussed from the perspective of the sampling theorem. If the target function of supervised learning is taken as a multi-dimensional signal and the labeled data as its samples, the training process can be regarded as a process of signal recovery. The main result is that the least amount of training data for a bandlimited task signal corresponds to a sampling rate larger than the Nyquist rate. Numerical experiments are carried out to compare the learning process with signal recovery, which demonstrates this result. Based on the equivalence between supervised learning and signal recovery, spectral methods can be used to reveal the underlying mechanisms of various supervised learning models, especially "black-box" neural networks.
Pages: 4891-4906
Number of pages: 16
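The analogy in the abstract lends itself to a short numerical illustration. The sketch below is a minimal example, not the authors' actual experiments: the bandwidth, sampling rates, and function names are assumptions chosen for the demonstration. It treats a bandlimited target function as the "task signal", takes its values at regularly spaced points as the "labeled data", and recovers the function by Whittaker-Shannon (sinc) interpolation; sampling above the Nyquist rate yields a much smaller recovery error than sub-Nyquist sampling, which aliases the highest-frequency component.

```python
import numpy as np

# Minimal sketch of the abstract's analogy: a bandlimited "task signal"
# is recovered from its "labeled data" (regularly spaced samples) by
# Whittaker-Shannon (sinc) interpolation. Bandwidth, rates, and names
# below are illustrative assumptions, not the paper's experimental setup.

B = 4.0                 # highest frequency in the target signal (Hz)
NYQUIST_RATE = 2 * B    # minimum sampling rate given by the sampling theorem

def target(t):
    """Bandlimited target function: no component above B Hz."""
    return np.sin(2 * np.pi * 1.0 * t) + 0.5 * np.cos(2 * np.pi * B * t)

def sinc_reconstruct(t, samples, sample_times, fs):
    """Whittaker-Shannon interpolation from samples taken at rate fs."""
    return np.sum(samples * np.sinc(fs * (t[:, None] - sample_times[None, :])),
                  axis=1)

t_dense = np.linspace(0.0, 4.0, 2000)          # dense evaluation grid
interior = (t_dense > 0.5) & (t_dense < 3.5)   # ignore truncation edge effects

for fs in (1.5 * NYQUIST_RATE, 0.6 * NYQUIST_RATE):  # above vs. below Nyquist
    sample_times = np.arange(0.0, 4.0, 1.0 / fs)     # "labeled data" locations
    recovered = sinc_reconstruct(t_dense, target(sample_times), sample_times, fs)
    err = np.max(np.abs(recovered[interior] - target(t_dense)[interior]))
    print(f"sampling rate {fs:5.1f} Hz -> max recovery error {err:.3f}")
```

In the paper's framing, the number of labeled points playing the role of sample_times is the quantity whose minimum the sampling theorem bounds.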