LipBaB: Computing Exact Lipschitz Constant of ReLU Networks

Cited by: 5
Authors
Bhowmick, Aritra [1 ]
D'Souza, Meenakshi [1 ]
Raghavan, G. Srinivasa [1 ]
Affiliations
[1] Int Inst Informat Technol Bangalore, Bangalore, Karnataka, India
Source
ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT IV | 2021, Vol. 12894
DOI
10.1007/978-3-030-86380-7_13
Chinese Library Classification
TP18 [Artificial intelligence theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
The Lipschitz constant of neural networks plays an important role in several contexts of deep learning, ranging from robustness certification and regularization to stability analysis of systems with neural network controllers. Obtaining tight bounds on the Lipschitz constant is therefore important. We introduce LipBaB, a branch and bound framework to compute certified bounds of the local Lipschitz constant of deep neural networks with ReLU activation functions up to any desired precision. It is based on iteratively upper-bounding the norm of the Jacobians corresponding to the different activation patterns of the network encountered within the input domain. Our algorithm can provide provably exact computation of the Lipschitz constant for any p-norm.
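As a rough illustration of the approach the abstract describes, the following is a minimal, self-contained Python/NumPy sketch, not the authors' implementation: it classifies ReLUs as surely active, surely inactive, or unstable over an input box via interval arithmetic, upper-bounds the Jacobian norm entrywise under the admissible activation patterns, and tightens the bound by branching on the input box (for p in {1, 2, inf}). The toy network, the box, the recursion depth, and all function names (relu_states, jacobian_norm_bound, branch_and_bound) are illustrative assumptions; unlike LipBaB, this sketch branches only on the input domain, not on activation patterns, so it yields certified upper bounds rather than the exact constant.

```python
import numpy as np

def relu_states(W_list, b_list, lo, hi):
    """Interval forward pass: for each hidden layer, decide which ReLUs are
    surely active or surely inactive over the whole input box [lo, hi]."""
    states = []
    for W, b in zip(W_list[:-1], b_list[:-1]):
        Wp, Wn = np.maximum(W, 0.0), np.minimum(W, 0.0)
        pre_lo = Wp @ lo + Wn @ hi + b
        pre_hi = Wp @ hi + Wn @ lo + b
        states.append((pre_lo >= 0.0, pre_hi <= 0.0))      # (active, inactive)
        lo, hi = np.maximum(pre_lo, 0.0), np.maximum(pre_hi, 0.0)
    return states

def jacobian_norm_bound(W_list, b_list, lo, hi, p=2):
    """Entrywise interval bounds on the Jacobian over the box, followed by the
    induced p-norm of the worst-case entrywise magnitudes (>= ||J||_p)."""
    states = relu_states(W_list, b_list, lo, hi)
    J_lo, J_hi = W_list[0], W_list[0]
    for W, (act, inact) in zip(W_list[1:], states):
        # inactive ReLUs have slope 0, active ones slope 1, unstable ones {0, 1}
        row_lo = np.where(inact[:, None], 0.0,
                          np.where(act[:, None], J_lo, np.minimum(J_lo, 0.0)))
        row_hi = np.where(inact[:, None], 0.0,
                          np.where(act[:, None], J_hi, np.maximum(J_hi, 0.0)))
        Wp, Wn = np.maximum(W, 0.0), np.minimum(W, 0.0)
        J_lo, J_hi = Wp @ row_lo + Wn @ row_hi, Wp @ row_hi + Wn @ row_lo
    return np.linalg.norm(np.maximum(np.abs(J_lo), np.abs(J_hi)), ord=p)

def branch_and_bound(W_list, b_list, lo, hi, depth=6):
    """Refine the bound by splitting the widest input dimension; the local
    Lipschitz constant over [lo, hi] is at most the returned value."""
    if depth == 0:
        return jacobian_norm_bound(W_list, b_list, lo, hi)
    d = int(np.argmax(hi - lo))
    mid = 0.5 * (lo[d] + hi[d])
    left_hi, right_lo = hi.copy(), lo.copy()
    left_hi[d], right_lo[d] = mid, mid
    return max(branch_and_bound(W_list, b_list, lo, left_hi, depth - 1),
               branch_and_bound(W_list, b_list, right_lo, hi, depth - 1))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W_list = [rng.normal(size=(4, 2)), rng.normal(size=(1, 4))]  # toy 2-4-1 net
    b_list = [rng.normal(size=4), np.zeros(1)]
    lo, hi = np.full(2, -1.0), np.full(2, 1.0)
    print("coarse bound :", jacobian_norm_bound(W_list, b_list, lo, hi))
    print("refined bound:", branch_and_bound(W_list, b_list, lo, hi))
```

Because interval bounds can only tighten on sub-boxes, each split keeps or lowers the per-box bound, which is what lets a branch-and-bound scheme of this kind drive the gap to the true local Lipschitz constant down with further refinement.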
Pages: 151-162
Number of pages: 12