Explainable AI supported hybrid deep learning method for layer 2 intrusion detection

Cited by: 0
|
Author
Kilincer, Ilhan Firat [1 ]
Affiliation
[1] Firat Univ, Digital Forens Engn, Elazig, Turkiye
Keywords
IDS; Deep Learning; Explainable AI
DOI
10.1016/j.eij.2025.100669
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
With rapidly developing technology, digital environments are also expanding. Although this development has many positive effects on daily life, the security vulnerabilities that accompany digitalization remain a major concern. A substantial network infrastructure lies behind many of the applications that organizations provide to their users. It is imperative that these extensive network infrastructures, which often carry sensitive data such as personal, commercial, financial and security information, are able to withstand cyberattacks. This study proposes a Comprehensive Layer 2 IDS (CL2-IDS) dataset for developing intrusion detection systems (IDS) used in the local network structures of organizations, together with a hybrid deep learning (DL) model for detecting the attack vectors in the proposed dataset. The hybrid model combines a Convolutional Neural Network (CNN) and a Bidirectional Long Short-Term Memory (Bi-LSTM) network, architectures widely used in areas such as image analysis and time-series data. The proposed hybrid DL model achieved an accuracy of 95.28% in classifying the CL2-IDS dataset, showing that the combination of these two complementary deep learning models yields successful results on the proposed dataset. In the final part of the study, the effect of the features in the CL2-IDS dataset on classification is interpreted with SHapley Additive exPlanations (SHAP), an Explainable Artificial Intelligence (XAI) method. With the CL2-IDS dataset and the hybrid CNN/Bi-LSTM model, the study facilitates intrusion detection and exemplifies how DL models and XAI techniques can be used to support IDS systems.
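The abstract describes the approach only at a high level. The following is a minimal, hypothetical sketch of a CNN + Bi-LSTM classifier with SHAP feature attribution in the spirit described above; the feature count, class count, layer sizes, the Conv1D-over-feature-vector layout, the `build_hybrid_model` helper, and the choice of KernelExplainer are illustrative assumptions, not the paper's actual configuration.

```python
# Hypothetical CNN + Bi-LSTM intrusion classifier with SHAP attribution.
# All sizes and names below are assumptions for illustration only.
import numpy as np
import tensorflow as tf
import shap

NUM_FEATURES = 40   # assumed feature count; the real CL2-IDS feature set may differ
NUM_CLASSES = 5     # assumed number of traffic/attack classes

def build_hybrid_model(num_features, num_classes):
    """CNN front end for local pattern extraction, Bi-LSTM for sequential context."""
    inputs = tf.keras.Input(shape=(num_features, 1))
    x = tf.keras.layers.Conv1D(64, kernel_size=3, padding="same", activation="relu")(inputs)
    x = tf.keras.layers.MaxPooling1D(pool_size=2)(x)
    x = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64))(x)
    x = tf.keras.layers.Dense(64, activation="relu")(x)
    outputs = tf.keras.layers.Dense(num_classes, activation="softmax")(x)
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Toy stand-in data; in practice these would be the scaled CL2-IDS feature vectors.
rng = np.random.default_rng(0)
X_train = rng.random((512, NUM_FEATURES, 1), dtype=np.float32)
y_train = rng.integers(0, NUM_CLASSES, size=512)
X_test = rng.random((64, NUM_FEATURES, 1), dtype=np.float32)

model = build_hybrid_model(NUM_FEATURES, NUM_CLASSES)
model.fit(X_train, y_train, epochs=2, batch_size=64, verbose=0)

# SHAP feature attribution. KernelExplainer is used here as a model-agnostic
# stand-in; the abstract does not state which SHAP explainer the study employs.
predict_fn = lambda x: model.predict(x.reshape(-1, NUM_FEATURES, 1), verbose=0)
background = shap.kmeans(X_train.reshape(-1, NUM_FEATURES)[:200], 10)
explainer = shap.KernelExplainer(predict_fn, background)
shap_values = explainer.shap_values(X_test[:8].reshape(-1, NUM_FEATURES), nsamples=100)
# shap_values holds one attribution per feature per class; summarising them
# (e.g. with shap.summary_plot) gives the kind of per-feature importance view
# that the study reports for the CL2-IDS dataset.
```

In this layout the Conv1D layer learns local combinations of adjacent features and the Bi-LSTM reads the resulting sequence in both directions, which is one common way to pair the two architectures on tabular flow features.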
Pages: 13
Related Papers (50 records)
  • [31] A General-Purpose Method for Applying Explainable AI for Anomaly Detection
    Sipple, John
    Youssef, Abdou
    FOUNDATIONS OF INTELLIGENT SYSTEMS (ISMIS 2022), 2022, 13515 : 162 - 174
  • [32] XI2S-IDS: An Explainable Intelligent 2-Stage Intrusion Detection System
    Mahmoud, Maiada M.
    Youssef, Yasser Omar
    Abdel-Hamid, Ayman A.
    FUTURE INTERNET, 2025, 17 (01)
  • [33] TEA-EKHO-IDS: An intrusion detection system for industrial CPS with trustworthy explainable AI and enhanced krill herd optimization
    Sivamohan, S.
    Sridhar, S. S.
    Krishnaveni, S.
    PEER-TO-PEER NETWORKING AND APPLICATIONS, 2023, 16 (04) : 1993 - 2021
  • [34] TEA-EKHO-IDS: An intrusion detection system for industrial CPS with trustworthy explainable AI and enhanced krill herd optimization
    S. Sivamohan
    S. S. Sridhar
    S. Krishnaveni
    Peer-to-Peer Networking and Applications, 2023, 16 : 1993 - 2021
  • [35] E-XAI: Evaluating Black-Box Explainable AI Frameworks for Network Intrusion Detection
    Arreche, Osvaldo
    Guntur, Tanish R.
    Roberts, Jack W.
    Abdallah, Mustafa
    IEEE ACCESS, 2024, 12 : 23954 - 23988
  • [36] Hybrid optimization and deep learning based intrusion detection system
    Gupta, Subham Kumar
    Tripathi, Meenakshi
    Grover, Jyoti
    COMPUTERS & ELECTRICAL ENGINEERING, 2022, 100
  • [37] Enhancing intrusion detection: a hybrid machine and deep learning approach
    Sajid, Muhammad
    Malik, Kaleem Razzaq
    Almogren, Ahmad
    Malik, Tauqeer Safdar
    Khan, Ali Haider
    Tanveer, Jawad
    Rehman, Ateeq Ur
    JOURNAL OF CLOUD COMPUTING-ADVANCES SYSTEMS AND APPLICATIONS, 2024, 13 (01):
  • [38] Explainable deep learning method for laser welding defect detection
    Liu T.
    Zheng H.
    Yang C.
    Bao J.
    Wang J.
    Gu J.
    Hangkong Xuebao/Acta Aeronautica et Astronautica Sinica, 2022, 43 (04):
  • [39] Explainable AI: A Hybrid Approach to Generate Human-Interpretable Explanation for Deep Learning Prediction
    De, Tanusree
    Giri, Prasenjit
    Mevawala, Ahmeduvesh
    Nemani, Ramyasri
    Deo, Arati
    COMPLEX ADAPTIVE SYSTEMS, 2020, 168 : 40 - 48
  • [40] Battery state-of-health estimation: An ultrasonic detection method with explainable AI
    Liu, Kailong
    Fang, Jingyang
    Zhao, Shiwen
    Liu, Yuhang
    Dai, Haifeng
    Ye, Liwang
    Peng, Qiao
    ENERGY, 2025, 319