HOBAT: Batch Verification for Homogeneous Structural Neural Networks

Cited by: 0
Authors
Li, Jingyang [1]
Li, Guoqiang [1]
Affiliation
[1] Shanghai Jiao Tong Univ, Sch Software, Shanghai 200240, Peoples R China
Source
2023 38TH IEEE/ACM INTERNATIONAL CONFERENCE ON AUTOMATED SOFTWARE ENGINEERING, ASE | 2023
Keywords
Batch verification; Neural networks; Homogeneous structure; Abstraction
DOI
10.1109/ASE56229.2023.00033
CLC number
TP [Automation technology; computer technology]
Discipline code
0812
Abstract
The rapid development of deep learning has significantly transformed the software engineering ecosystem. As new data grows and evolves at an explosive rate, iteratively updating software built on neural networks has become a critical issue. Continual learning enables a network to incorporate new data without losing previous knowledge, yielding a batch of updated networks as candidates for the next software release; current practice, however, merely selects among these candidates by empirically testing their accuracy and offers no formal guarantees for the batch, especially in the presence of adversarial samples. Existing verification techniques, based on constraint solving, interval propagation, and linear approximation, do provide formal guarantees, but they are designed to verify properties of an individual network rather than a batch of networks. To address this gap, we analyze the batch verification problems arising from several non-traditional machine learning paradigms and propose HOBAT (BATch verification for HOmogeneous structural neural networks), a framework that, under reasonable assumptions about the shared representation of homogeneous-structure networks, makes batch verification scalable in practical applications. Our method abstracts the neurons at the same position across a batch of networks into a single neuron, then iteratively refines the abstracted neurons to restore precision until the desired property is verified. The method is orthogonal to bound-propagation verification of a single network; integrating it with bound propagation, we observe significant improvements over the vanilla approach. Our experiments demonstrate the great potential of verifying large batches of networks in the era of big data.
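The core idea in the abstract, merging the parameters at the same position across a batch of same-architecture networks into one abstract network and propagating sound bounds through it, can be illustrated with a minimal interval-weight sketch. This is not HOBAT's actual abstraction or its refinement loop; all function names and the midpoint/radius interval arithmetic below are illustrative assumptions, shown only to make the "one abstract neuron per position" idea concrete.

```python
import numpy as np

def abstract_weights(weight_batch):
    """Merge the corresponding weight matrices of a batch of networks
    with identical shapes into a single interval weight matrix
    (elementwise min/max hull over the batch)."""
    W = np.stack(weight_batch)              # shape: (batch, out, in)
    return W.min(axis=0), W.max(axis=0)

def interval_matvec(Wl, Wu, xl, xu):
    """Sound bounds on W @ x for any W in [Wl, Wu] and x in [xl, xu],
    using midpoint/radius interval arithmetic."""
    Wm, Wr = (Wl + Wu) / 2, (Wu - Wl) / 2   # weight midpoint / radius
    xm, xr = (xl + xu) / 2, (xu - xl) / 2   # input midpoint / radius
    ym = Wm @ xm
    yr = np.abs(Wm) @ xr + Wr @ np.abs(xm) + Wr @ xr
    return ym - yr, ym + yr

def propagate(layers, xl, xu):
    """Push an input box through the abstract network: interval-weight
    affine layers with ReLU in between (no ReLU after the last layer)."""
    for i, (Wl, Wu) in enumerate(layers):
        xl, xu = interval_matvec(Wl, Wu, xl, xu)
        if i < len(layers) - 1:             # ReLU is monotone, so
            xl, xu = np.maximum(xl, 0), np.maximum(xu, 0)
    return xl, xu

# Usage: four tiny 2-layer ReLU nets, abstracted layer by layer.
rng = np.random.default_rng(0)
nets = [[rng.normal(size=(3, 2)), rng.normal(size=(2, 3))] for _ in range(4)]
layers = [abstract_weights([net[k] for net in nets]) for k in range(2)]
xl, xu = np.full(2, -0.1), np.full(2, 0.1)
yl, yu = propagate(layers, xl, xu)          # bounds valid for ALL four nets
```

Because the interval hull contains every network's weights, one propagation through the abstract network bounds the outputs of the whole batch at once; the loss of precision this introduces is what an iterative refinement step, as described in the abstract, would then recover.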
Pages: 1276-1287
Page count: 12