Probabilistic Verification of ReLU Neural Networks via Characteristic Functions

Cited by: 0
Authors
Pilipovsky, Joshua [1 ]
Sivaramakrishnan, Vignesh [2 ]
Oishi, Meeko M. K. [2 ]
Tsiotras, Panagiotis [1 ]
Affiliations
[1] Georgia Inst Technol, Daniel Guggenheim Sch Aerosp Engn, Atlanta, GA 30332 USA
[2] Univ New Mexico, Dept Elect & Comp Engn, Albuquerque, NM USA
Source
LEARNING FOR DYNAMICS AND CONTROL CONFERENCE, VOL 211, 2023
Funding
U.S. National Science Foundation (NSF)
Keywords
Neural networks; ReLU; verification; characteristic functions; distributional control; robustness
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline code
0812
Abstract
Verifying the input-output relationships of a neural network to achieve desired performance specifications is a difficult, yet important, problem due to the growing ubiquity of neural nets in many engineering applications. We use ideas from probability theory in the frequency domain to provide probabilistic verification guarantees for ReLU neural networks. Specifically, we interpret a (deep) feedforward neural network as a discrete-time dynamical system over a finite horizon that shapes distributions of initial states, and use characteristic functions to propagate the distribution of the input data through the network. Using the inverse Fourier transform, we obtain the corresponding cumulative distribution function of the output set, which we use to check if the network is performing as expected given any random point from the input set. The proposed approach does not require distributions to have well-defined moments or moment generating functions. We demonstrate our proposed approach on two examples, and compare its performance to related approaches.
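The final step the abstract describes, recovering a cumulative distribution function from a characteristic function via the inverse Fourier transform, can be illustrated with a minimal numerical sketch. This is not the paper's implementation: it uses the standard Gil-Pelaez inversion formula, F(x) = 1/2 - (1/pi) * ∫₀^∞ Im(e^{-itx} φ(t)) / t dt, applied to a known Gaussian characteristic function, where the function name `gil_pelaez_cdf` and all parameters are illustrative choices.

```python
import numpy as np

def gil_pelaez_cdf(phi, x, t_max=50.0, n=20000):
    """Recover the CDF F(x) from a characteristic function phi via
    Gil-Pelaez inversion: F(x) = 1/2 - (1/pi) * int_0^inf Im(e^{-itx} phi(t)) / t dt."""
    t = np.linspace(1e-6, t_max, n)                 # start just above 0 to avoid the t=0 singularity
    f = np.imag(np.exp(-1j * t * x) * phi(t)) / t   # integrand of the Gil-Pelaez formula
    dt = t[1] - t[0]
    integral = np.sum((f[:-1] + f[1:]) / 2.0) * dt  # trapezoidal rule
    return 0.5 - integral / np.pi

# Characteristic function of a standard normal: phi(t) = exp(-t^2 / 2)
phi_gauss = lambda t: np.exp(-0.5 * t ** 2)

print(gil_pelaez_cdf(phi_gauss, 0.0))    # ~0.5  (median of the standard normal)
print(gil_pelaez_cdf(phi_gauss, 1.645))  # ~0.95 (95th percentile)
```

In the paper's setting, `phi` would instead be the characteristic function obtained by propagating the input distribution through the ReLU layers; once that function is available, the verification check reduces to CDF evaluations like the ones above.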
Pages: 14