Structured pruning of neural networks for constraints learning

Cited by: 1
Authors
Cacciola, Matteo [1 ]
Frangioni, Antonio [2 ]
Lodi, Andrea [3 ,4 ]
Affiliations
[1] Polytech Montreal, CERC, Montreal, PQ, Canada
[2] Univ Pisa, Pisa, Italy
[3] Cornell Tech, New York, NY 10044 USA
[4] Technion IIT, New York, NY 10011 USA
Funding
Natural Sciences and Engineering Research Council of Canada (NSERC);
Keywords
Artificial neural networks; Mixed integer programming; Model compression; Pruning; Analytics;
DOI
10.1016/j.orl.2024.107194
Chinese Library Classification
C93 [Management]; O22 [Operations Research];
Subject Classification Codes
070105; 12; 1201; 1202; 120202;
Abstract
In recent years, the integration of Machine Learning (ML) models with Operations Research (OR) tools has gained popularity in applications such as cancer treatment, algorithmic configuration, and chemical process optimization. This integration often uses Mixed Integer Programming (MIP) formulations to represent the chosen ML model, which is often an Artificial Neural Network (ANN) given their widespread use. However, ANNs frequently contain a large number of parameters, resulting in MIP formulations that are impractical to solve. In this paper we showcase the effectiveness of ANN pruning when applied to models prior to their integration into MIPs. We discuss why pruning is more suitable in this context than other ML compression techniques, and we highlight the potential of appropriate pruning strategies via experiments on MIPs used to construct adversarial examples for ANNs. Our results demonstrate that pruning offers remarkable reductions in solution times without hindering the quality of the final decision, enabling the resolution of previously unsolvable instances.
Pages: 7
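Below is a minimal sketch, not the authors' code, of the mechanism the abstract relies on: the standard big-M MIP encoding of a one-hidden-layer ReLU network, used to search for an adversarial example inside an L-infinity ball around a reference input. It is written with PuLP and its bundled CBC solver; the toy weights, layer sizes, pruned-neuron set, and all variable names are illustrative assumptions.

```python
# A minimal sketch, assuming PuLP (with the bundled CBC solver) and toy
# random weights: the standard big-M MIP encoding of a one-hidden-layer
# ReLU network, used to search for an adversarial example. All names,
# sizes, and the pruned-neuron set are illustrative, not from the paper.
import numpy as np
import pulp

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 4)), rng.normal(size=8)  # hidden layer
W2, b2 = rng.normal(size=(3, 8)), rng.normal(size=3)  # output logits

# Structured pruning stand-in: drop whole hidden neurons (rows of W1).
pruned = {5, 6, 7}
keep = [j for j in range(8) if j not in pruned]

x0 = rng.uniform(size=4)        # reference input to perturb
eps, M = 0.1, 100.0             # L-inf radius and a (loose) big-M bound
src, tgt = 0, 1                 # original class and attack target

prob = pulp.LpProblem("adversarial_example", pulp.LpMaximize)
x = [pulp.LpVariable(f"x{i}", float(x0[i]) - eps, float(x0[i]) + eps)
     for i in range(4)]

# One binary z_j and three constraints per *kept* ReLU neuron:
# y_j = max(0, a_j) is linearized as y >= a, y <= a + M(1-z), y <= M z.
y, z = {}, {}
for j in keep:
    a = pulp.lpSum(float(W1[j, i]) * x[i] for i in range(4)) + float(b1[j])
    y[j] = pulp.LpVariable(f"y{j}", lowBound=0)
    z[j] = pulp.LpVariable(f"z{j}", cat="Binary")
    prob += y[j] >= a
    prob += y[j] <= a + M * (1 - z[j])
    prob += y[j] <= M * z[j]

def logit(k):
    # Pruned neurons contribute nothing, so they never enter the model.
    return pulp.lpSum(float(W2[k, j]) * y[j] for j in keep) + float(b2[k])

# Maximize the target-minus-source logit gap inside the eps-ball;
# a strictly positive optimum certifies an adversarial example.
prob += logit(tgt) - logit(src)
prob.solve(pulp.PULP_CBC_CMD(msg=False))
print(pulp.LpStatus[prob.status], pulp.value(prob.objective))
```

With z_j = 1 the three constraints pin y_j to its pre-activation, and with z_j = 0 they pin it to zero; since each surviving ReLU neuron costs one binary variable and three constraints, structurally pruning whole neurons directly shrinks the MIP and its branch-and-bound tree, which is the effect the paper quantifies.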