CCBLA: a Lightweight Phishing Detection Model Based on CNN, BiLSTM, and Attention Mechanism

Cited by: 10
Authors
Zhu, Erzhou [1 ]
Yuan, Qixiang [1 ]
Chen, Zhile [1 ]
Li, Xuejian [1 ]
Fang, Xianyong [1 ]
Affiliations
[1] Anhui Univ, Key Lab Intelligent Comp & Signal Proc, Minist Educ, Sch Comp Sci & Technol, Hefei 230601, Peoples R China
Keywords
Phishing detection; Deep learning; Neural network; Attention mechanism; Feature selection
DOI
10.1007/s12559-022-10024-4
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Phishing, in which social engineering techniques such as emails and instant messaging are employed and malicious links are disguised as normal URLs to steal sensitive information, is currently a major threat to networks worldwide. Phishing detection systems generally adopt feature engineering as one of the most important approaches to detect or even prevent phishing attacks. However, the accuracy of feature-engineering systems depends heavily on prior knowledge of the features, and extracting comprehensive features across different dimensions to reach high detection accuracy is time-consuming. To address these issues, this paper proposes a lightweight model that combines a convolutional neural network (CNN), bi-directional long short-term memory (BiLSTM), and an attention mechanism for phishing detection. The proposed model, called the char-convolutional and BiLSTM with attention mechanism (CCBLA) model, employs deep learning to automatically extract features from target URLs and uses the attention mechanism to weight the importance of the extracted features according to their different roles in phishing detection. The results of experiments conducted on two datasets of different scales show that CCBLA detects phishing attacks accurately with minimal time consumption.
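The abstract outlines the CCBLA pipeline: character-level embedding of the URL, a convolutional layer, a BiLSTM, and an attention layer that weights the recurrent outputs before classification. The sketch below is a minimal, hypothetical PyTorch rendering of such a pipeline for illustration only; all hyperparameters (vocabulary size, embedding dimension, kernel size, hidden size) are assumptions and are not the values reported in the paper.

```python
import torch
import torch.nn as nn

class CCBLASketch(nn.Module):
    """Illustrative char-CNN + BiLSTM + attention URL classifier.

    Hyperparameters are assumed for the sketch, not taken from the paper.
    """
    def __init__(self, vocab_size=128, embed_dim=64, conv_channels=128,
                 lstm_hidden=64, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Character-level convolution over the URL sequence
        self.conv = nn.Conv1d(embed_dim, conv_channels, kernel_size=3, padding=1)
        self.bilstm = nn.LSTM(conv_channels, lstm_hidden, batch_first=True,
                              bidirectional=True)
        # Simple additive attention over the BiLSTM outputs
        self.attn = nn.Linear(2 * lstm_hidden, 1)
        self.fc = nn.Linear(2 * lstm_hidden, num_classes)

    def forward(self, x):                        # x: (batch, seq_len) char ids
        e = self.embed(x)                        # (batch, seq_len, embed_dim)
        c = torch.relu(self.conv(e.transpose(1, 2))).transpose(1, 2)
        h, _ = self.bilstm(c)                    # (batch, seq_len, 2*hidden)
        weights = torch.softmax(self.attn(h), dim=1)   # attention weights per timestep
        context = (weights * h).sum(dim=1)       # weighted sum of timesteps
        return self.fc(context)                  # class logits

# Usage on a dummy batch of 8 URLs padded to 200 characters
model = CCBLASketch()
logits = model(torch.randint(1, 128, (8, 200)))
print(logits.shape)  # torch.Size([8, 2])
```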
Pages: 1320-1333
Number of pages: 14
Related Papers
50 records in total
  • [41] BiCHAT: BiLSTM with deep CNN and hierarchical attention for hate speech detection
    Khan, Shakir
    Fazil, Mohd
    Sejwal, Vineet Kumar
    Alshara, Mohammed Ali
    Alotaibi, Reemiah Muneer
    Kamal, Ashraf
    Baig, Abdul Rauf
    JOURNAL OF KING SAUD UNIVERSITY-COMPUTER AND INFORMATION SCIENCES, 2022, 34 (07) : 4335 - 4344
  • [42] Curve-based lane estimation model with lightweight attention mechanism
    Zhang, Jindong
    Zhong, Haoting
    SIGNAL IMAGE AND VIDEO PROCESSING, 2023, 17 : 2637 - 2643
  • [43] A Maturity Detection Method for Hemerocallis Citrina Baroni Based on Lightweight and Attention Mechanism
    Sheng, Bin
    Wu, Ligang
    Zhang, Nan
    APPLIED SCIENCES-BASEL, 2023, 13 (21):
  • [44] Advancing PPG-based cf-PWV estimation with an integrated CNN-BiLSTM-Attention model
    Abrisham, Kiana Pilevar
    Alipour, Khalil
    Tarvirdizadeh, Bahram
    Ghamari, Mohammad
    SIGNAL IMAGE AND VIDEO PROCESSING, 2024, 18 (12) : 8621 - 8633
  • [45] Lightweight CNN-BiLSTM based Intrusion Detection Systems for Resource-Constrained IoT Devices
    Jouhari, Mohammed
    Guizani, Mohsen
    20TH INTERNATIONAL WIRELESS COMMUNICATIONS & MOBILE COMPUTING CONFERENCE, IWCMC 2024, 2024, : 1558 - 1563
  • [46] Lightweight intrusion detection model based on CNN and knowledge distillation
    Wang, Long-Hui
    Dai, Qi
    Du, Tony
    Chen, Li-fang
    APPLIED SOFT COMPUTING, 2024, 165
  • [47] Reliable social media framework: fake news detection using modified feature attention based CNN-BiLSTM
    Srikanth, D.
    Prasad, K. Krishna
    Kannan, M.
    Kanchana, D.
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2024 : 2971 - 2996
  • [48] Short-term load forecasting based on CNN-BiLSTM with Bayesian optimization and attention mechanism
    Shi, Huifeng
    Miao, Kai
    Ren, Xiaochen
    CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2023, 35 (17)
  • [49] A Lightweight Detector Based on Attention Mechanism for Fabric Defect Detection
    Luo, Xin
    Ni, Qing
    Tao, Ran
    Shi, Youqun
    IEEE ACCESS, 2023, 11 : 33554 - 33569
  • [50] Lightweight SAR ship detection algorithm based on attention mechanism
    Fu, Weihong
    Zheng, Peiyuan
    SIGNAL IMAGE AND VIDEO PROCESSING, 2025, 19 (01)