Active defense mechanism for network security situation prediction based on transformer and TCN

Cited: 0
Author
Chen, Huanying [1 ]
Affiliation
[1] Henan Qual Polytech, Pingdingshan 467000, Peoples R China
Source
ENGINEERING RESEARCH EXPRESS | 2025, Vol. 7, No. 2
Keywords
transformer; TCN; self-attention mechanism; network security; situation prediction;
DOI
10.1088/2631-8695/addc3a
Chinese Library Classification
T [Industrial technology];
Subject Classification Code
08;
Abstract
With the continuous evolution of cyber-attacks, traditional network security situation prediction methods generally suffer from insufficient predictive accuracy and low computational efficiency when facing large and dynamically changing data environments, making it difficult to meet today's demand for real-time defense and proactive response. To address these challenges, this study constructs a network security situation prediction model that integrates causal convolution and temporal convolution structures. The model introduces a causal convolution mechanism to enhance the ability of the attention structure to model causal dependencies and local abrupt changes in time series, and uses a temporal convolutional network (TCN) to extract local temporal features at multiple scales, achieving synergistic modeling of global and local features. In experiments, the model achieved a mean absolute error of 0.0085 and a root mean squared error of 0.0143 for network security situation prediction, with an average prediction time of 213 ms, outperforming the comparison models. These results demonstrate that the proposed intelligent prediction model can forecast the network security situation quickly and accurately, raising the intelligence level of network security defense and helping network administrators deploy defense strategies in advance to reduce the damage caused by potential attacks.
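The two building blocks named in the abstract, a causal (dilated) convolution that only looks backward in time and a causally masked self-attention over the sequence, can be illustrated with a minimal numpy sketch. This is not the paper's implementation; all function names and shapes here are illustrative assumptions, and a real TCN-Transformer model would stack many such layers with learned weights.

```python
import numpy as np

def causal_conv1d(x, w, dilation=1):
    """Causal dilated 1-D convolution: output at step t uses only x[<= t].

    Left-padding with (k - 1) * dilation zeros (a standard TCN trick)
    keeps the output length equal to the input length without leaking
    future values into the past.
    """
    k = len(w)
    pad = (k - 1) * dilation
    xp = np.concatenate([np.zeros(pad), x])
    return np.array([
        sum(w[j] * xp[t + pad - j * dilation] for j in range(k))
        for t in range(len(x))
    ])

def causal_self_attention(X):
    """Single-head scaled dot-product self-attention over a (T, d) sequence.

    An upper-triangular mask enforces causality: position t may only
    attend to positions <= t, matching the causal convolution above.
    """
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)
    mask = np.triu(np.ones_like(scores, dtype=bool), k=1)
    scores[mask] = -np.inf
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ X
```

Stacking `causal_conv1d` with dilations 1, 2, 4, ... yields the multi-scale local features described in the abstract (the receptive field grows exponentially with depth), while `causal_self_attention` supplies the global view; the model's "synergistic modeling" combines the two feature streams.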
Pages: 15