Exploiting Deep Neural Networks as Covert Channels

Cited by: 0
Authors
Pishbin, Hora Saadaat [1 ]
Bidgoly, Amir Jalaly [1 ]
Affiliations
[1] Univ Qom, Dept Informat Technol & Comp Engn, Qom 3716146611, Iran
Keywords
Data models; Computational modeling; Deep learning; Receivers; Training; Artificial neural networks; Malware; Trustworthy machine learning; deep neural network; covert channel; deep learning attack; concealment;
DOI
10.1109/TDSC.2023.3300072
Chinese Library Classification
TP3 [Computing technology; computer technology]
Discipline code
0812
Abstract
With the increasing development of deep learning models, the security of these models has become more important. In this work, for the first time, we investigate the possibility of abusing a deep model as a covert channel. A covert channel uses a channel that was not designed for information exchange to transmit a hidden message. This work studies how an adversary can exploit a deep model as such a channel. The proposed approach trains an end-to-end deep model, called the covert model, to produce artificial data that encodes a covert message. This artificial data is fed as input to the deep model being exploited as the covert channel, such that the hidden signal is embedded in that model's output. Generative adversarial networks are used to make the concealment indistinguishable. The results show that it is possible to establish a covert channel with acceptable message transmission power in well-known deep models such as ResNet and InceptionV3. Case studies yield a signal-to-noise ratio (SNR) of 12.67 and a bit error rate (BER) of 0.08, while the accuracy of the deep model used to hide the signal reaches 92%.
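As a rough illustration of the channel described in the abstract, the following PyTorch sketch trains a small encoder/decoder pair around a frozen target network: the encoder maps a bit string to an artificial carrier input, and the decoder recovers the bits from the target model's output logits. The class names (CovertEncoder, CovertDecoder), the 32-bit message length, the layer sizes, and the use of resnet18 with untrained weights are illustrative assumptions, not the authors' implementation; the GAN component the paper uses for indistinguishability is omitted.

```python
# Minimal covert-channel sketch, assuming a PyTorch setup.
# All architecture and hyperparameter choices below are assumptions for illustration.
import torch
import torch.nn as nn
from torchvision.models import resnet18

MSG_BITS = 32  # assumed covert message length in bits

class CovertEncoder(nn.Module):
    """Maps a bit vector to an artificial 3x224x224 input for the target model."""
    def __init__(self, msg_bits=MSG_BITS):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(msg_bits, 256), nn.ReLU(),
            nn.Linear(256, 3 * 224 * 224), nn.Sigmoid(),
        )
    def forward(self, bits):
        return self.net(bits).view(-1, 3, 224, 224)

class CovertDecoder(nn.Module):
    """Recovers the bit vector from the target model's output logits."""
    def __init__(self, num_classes=1000, msg_bits=MSG_BITS):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_classes, 256), nn.ReLU(),
            nn.Linear(256, msg_bits),
        )
    def forward(self, logits):
        return self.net(logits)

# The target model abused as the channel; in practice this would be a pretrained
# network (e.g., ResNet or InceptionV3). Its weights are never modified.
target = resnet18(weights=None).eval()
for p in target.parameters():
    p.requires_grad_(False)

encoder, decoder = CovertEncoder(), CovertDecoder()
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(200):  # toy end-to-end training loop
    bits = torch.randint(0, 2, (16, MSG_BITS)).float()
    x = encoder(bits)          # artificial carrier data
    logits = target(x)         # forward pass through the exploited model
    recovered = decoder(logits)
    loss = loss_fn(recovered, bits)
    opt.zero_grad(); loss.backward(); opt.step()

# Estimate the bit error rate on a fresh batch of random messages.
with torch.no_grad():
    bits = torch.randint(0, 2, (64, MSG_BITS)).float()
    ber = ((decoder(target(encoder(bits))) > 0) != bits.bool()).float().mean()
    print(f"BER: {ber.item():.3f}")
```

In this sketch only the encoder and decoder are trained; gradients flow through the frozen target model to the carrier input, so the target's parameters and its normal classification behavior are left untouched, matching the abstract's premise that the exploited model itself is not modified.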
Pages: 2115 - 2126
Page count: 12