Optimal approximation using complex-valued neural networks

Cited by: 0
Authors
Geuchen, Paul [1 ]
Voigtlaender, Felix [1 ]
Affiliations
[1] KU Eichstatt Ingolstadt, MIDS, Schanz 49, D-85049 Ingolstadt, Germany
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023) | 2023
Keywords
BOUNDS;
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Complex-valued neural networks (CVNNs) have recently shown promising empirical success, for instance for increasing the stability of recurrent neural networks and for improving the performance in tasks with complex-valued inputs, such as in MRI fingerprinting. While the overwhelming success of deep learning in the real-valued case is supported by a growing mathematical foundation, such a foundation is still largely lacking in the complex-valued case. We thus analyze the expressivity of CVNNs by studying their approximation properties. Our results yield the first quantitative approximation bounds for CVNNs that apply to a wide class of activation functions, including the popular modReLU and complex cardioid activation functions. Precisely, our results apply to any activation function that is smooth but not polyharmonic on some non-empty open set; this is the natural generalization of the class of smooth and non-polynomial activation functions to the complex setting. Our main result shows that the error for the approximation of C^k functions scales as m^(-k/(2n)) for m → ∞, where m is the number of neurons, k the smoothness of the target function, and n the (complex) input dimension. Under a natural continuity assumption, we show that this rate is optimal; we further discuss the optimality when dropping this assumption. Moreover, we prove that the problem of approximating C^k functions using continuous approximation methods unavoidably suffers from the curse of dimensionality.
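The two activation functions named in the abstract have standard closed forms in the literature: modReLU shrinks the modulus of its input by a bias while preserving the phase, and the complex cardioid scales the input by a factor depending only on its phase. A minimal NumPy sketch (function names and the vectorized handling of z = 0 are our own choices):

```python
import numpy as np

def modrelu(z, b):
    """modReLU: shift the modulus |z| by bias b, apply ReLU to it,
    and keep the original phase of z. Returns 0 where |z| + b < 0."""
    r = np.abs(z)
    # divide by 1 instead of 0 at the origin; the output there is 0 anyway
    scale = np.maximum(r + b, 0.0) / np.where(r == 0, 1.0, r)
    return scale * z

def cardioid(z):
    """Complex cardioid: scale z by (1 + cos(arg z)) / 2, which acts as
    the identity on the positive real axis and vanishes on the negative
    real axis, smoothly interpolating in between."""
    return 0.5 * (1.0 + np.cos(np.angle(z))) * z
```

Both functions are smooth away from isolated sets but not polyharmonic, so they fall within the class of activations covered by the paper's bounds.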
Pages: 57