TFHSVul: A Fine-Grained Hybrid Semantic Vulnerability Detection Method Based on Self-Attention Mechanism in IoT

Cited by: 0
Authors
Xu, Lijuan [1 ]
An, Baolong [1 ]
Li, Xin [1 ]
Zhao, Dawei [1 ]
Peng, Haipeng [2 ,3 ]
Song, Weizhao [1 ]
Tong, Fenghua [1 ]
Han, Xiaohui [1 ]
Affiliations
[1] Qilu Univ Technol, Shandong Acad Sci, Shandong Comp Sci Ctr, Natl Supercomp Ctr Jinan,Key Lab Comp Power Networ, Jinan 250014, Peoples R China
[2] Beijing Univ Posts & Telecommun, Informat Secur Ctr, State Key Lab Networking & Switching Technol, Beijing 100876, Peoples R China
[3] Beijing Univ Posts & Telecommun, Natl Engn Lab Disaster Backup & Recovery, Beijing 100876, Peoples R China
Source
IEEE INTERNET OF THINGS JOURNAL | 2025, Vol. 12, No. 1
Funding
National Natural Science Foundation of China;
Keywords
Codes; Internet of Things; Feature extraction; Source coding; Security; Accuracy; Semantics; Deep learning; network security; software vulnerabilities; vulnerability detection;
DOI
10.1109/JIOT.2024.3459921
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Current vulnerability detection methods face challenges such as inadequate feature representation, limited feature extraction capability, and coarse detection granularity. To address these issues, we propose a fine-grained hybrid semantic vulnerability detection framework based on the Transformer, named TFHSVul. First, the source code is transformed into both sequential and graph-based representations to capture multilevel features, overcoming the information loss caused by relying on a single intermediate representation. To strengthen feature extraction, TFHSVul integrates a multiscale fusion convolutional neural network, a residual graph convolutional network, and a pretrained language model into its core architecture, significantly boosting performance. We further design a fine-grained detection method based on a self-attention mechanism that achieves statement-level detection, addressing the issue of coarse detection granularity. Compared with existing baselines on public data sets, TFHSVul improves the F1 score at the function level by 0.58 over the best-performing model and improves Top-10 accuracy at statement-level detection by 10% over the best-performing method.
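The statement-level localization idea described in the abstract, scoring statements with a self-attention mechanism and ranking the Top-k most suspicious ones, can be illustrated with a minimal, framework-free sketch. This is not the authors' implementation: the embeddings, the use of received attention mass as a suspiciousness score, and all function names are illustrative assumptions.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of logits."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def self_attention_scores(stmt_embeddings):
    """Scaled dot-product self-attention over statement embeddings.

    For each statement j, accumulate the attention mass it receives
    from every statement i; this total serves here as a coarse
    suspiciousness score (an assumption for illustration only).
    """
    n = len(stmt_embeddings)
    d = len(stmt_embeddings[0])
    received = [0.0] * n
    for i in range(n):
        # logits[j] = <e_i, e_j> / sqrt(d), as in scaled dot-product attention
        logits = [
            sum(a * b for a, b in zip(stmt_embeddings[i], stmt_embeddings[j]))
            / math.sqrt(d)
            for j in range(n)
        ]
        for j, w in enumerate(softmax(logits)):
            received[j] += w
    return received

def top_k_statements(stmt_embeddings, k=10):
    """Return indices of the k statements with the highest attention mass,
    mirroring the Top-k ranking used in statement-level evaluation."""
    scores = self_attention_scores(stmt_embeddings)
    return sorted(range(len(scores)), key=lambda j: scores[j], reverse=True)[:k]
```

Because each attention row is a softmax, the scores over all statements always sum to the number of statements; a Top-10 accuracy metric then simply checks whether the truly vulnerable statement appears among the ten highest-scoring indices.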
Pages: 30-44
Page count: 15
Related Papers
50 records
  • [1] A fine-grained classification method based on self-attention Siamese network
    He Can
    Yuan Guowu
    Wu Hao
    2021 THE 5TH INTERNATIONAL CONFERENCE ON VIDEO AND IMAGE PROCESSING, ICVIP 2021, 2021, : 148 - 154
  • [2] Self-Attention based fine-grained cross-media hybrid network
    Shan, Wei
    Huang, Dan
    Wang, Jiangtao
    Zou, Feng
    Li, Suwen
    PATTERN RECOGNITION, 2022, 130
  • [3] Person re-identification method based on fine-grained feature fusion and self-attention mechanism
    Yin, Kangning
    Ding, Zhen
    Dong, Zhihua
    Ji, Xinhui
    Wang, Zhipei
    Chen, Dongsheng
    Li, Ye
    Yin, Guangqiang
    Wang, Zhiguo
    COMPUTING, 2024, 106 (05) : 1681 - 1705
  • [5] Multigranularity Self-Attention Network for Fine-Grained Ship Detection in Remote Sensing Images
    Ouyang, Lihan
    Fang, Leyuan
    Ji, Xinyu
    IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2022, 15 : 9722 - 9732
  • [6] Spatial self-attention network with self-attention distillation for fine-grained image recognition
    Baffour, Adu Asare
    Qin, Zhen
    Wang, Yong
    Qin, Zhiguang
    Choo, Kim-Kwang Raymond
    JOURNAL OF VISUAL COMMUNICATION AND IMAGE REPRESENTATION, 2021, 81
  • [7] Hierarchical Attention Network for Interpretable and Fine-Grained Vulnerability Detection
    Gu, Mianxue
    Feng, Hantao
    Sun, Hongyu
    Liu, Peng
    Yue, Qiuling
    Hu, Jinglu
    Cao, Chunjie
    Zhang, Yuqing
    IEEE INFOCOM 2022 - IEEE CONFERENCE ON COMPUTER COMMUNICATIONS WORKSHOPS (INFOCOM WKSHPS), 2022,
  • [8] Fine-grained entity type classification using GRU with self-attention
    Dhrisya K.
    Remya G.
    Mohan A.
    International Journal of Information Technology, 2020, 12 (3) : 869 - 878
  • [9] Fine-grained image classification method based on hybrid attention module
    Lu, Weixiang
    Yang, Ying
    Yang, Lei
    FRONTIERS IN NEUROROBOTICS, 2024, 18
  • [10] A biomedical event extraction method based on fine-grained and attention mechanism
    He, Xinyu
    Tai, Ping
    Lu, Hongbin
    Huang, Xin
    Ren, Yonggong
    BMC BIOINFORMATICS, 2022, 23 (01)