LETFORMER: LIGHTWEIGHT TRANSFORMER PRE-TRAINING WITH SHARPNESS-AWARE OPTIMIZATION FOR EFFICIENT ENCRYPTED TRAFFIC ANALYSIS

Cited by: 0
Authors
Meng, Zhiyan [1 ]
Liu, Dan [1 ]
Meng, Jintao [2 ]
Affiliations
[1] Research Institute of Electronic Science and Technology, University of Electronic Science and Technology of China, No. 2006, Xiyuan Avenue, West Hi-Tech Zone, Chengdu
[2] National Key Laboratory of Security Communication, No. 35, Huangjing Road, Shuangliu County, Chengdu
Source
International Journal of Innovative Computing, Information and Control | 2025, Vol. 21, No. 2
Keywords
Deep learning; Encrypted traffic classification; LETformer; Sharpness-aware optimization; Sparse relative position embedding
DOI
10.24507/ijicic.21.02.359
Abstract
Reliable encrypted traffic classification is fundamental to advancing cybersecurity and to managing exponentially growing data streams. The success of large language models in fields such as natural language processing demonstrates the feasibility of learning general paradigms from extensive corpora, making pre-trained encrypted traffic classification methods a preferred choice. However, attention-based pre-trained classifiers face two key constraints: their large parameter counts make them unsuitable for low-computation environments such as mobile devices and real-time classification scenarios, and they tend to fall into local minima, which leads to overfitting. We develop a shallow, lightweight Transformer model named LETformer. During pre-training we apply sharpness-aware optimization to avoid local minima, capture temporal features with relative positional embeddings, and optimize the classifier to maintain classification accuracy on downstream tasks. We evaluate our method on four datasets: USTC-TFC2016, ISCX-VPN2016, ISCX-Tor2016, and CICIoT2022. Despite having only 17.6 million parameters, LETformer achieves classification metrics comparable to those of methods with ten times as many parameters. © 2025 ICIC International.
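The record does not include the paper's training code; the sketch below illustrates a generic sharpness-aware minimization (SAM) step of the kind the abstract describes, written in PyTorch. The function name sam_step, the neighborhood radius rho=0.05, and the loss_fn/batch interface are illustrative assumptions, not the authors' LETformer pre-training implementation.

```python
import torch

def sam_step(model, loss_fn, batch, optimizer, rho=0.05):
    # One sharpness-aware minimization (SAM) step: evaluate the gradient at
    # an adversarially perturbed point w + e(w) rather than at w itself,
    # biasing training toward flat minima. Hypothetical helper, not the
    # authors' code.
    inputs, targets = batch

    # First pass: gradient at the current weights w.
    loss = loss_fn(model(inputs), targets)
    loss.backward()

    # Perturbation e(w) = rho * g / ||g||_2, with the norm taken jointly
    # over all parameters.
    grads = [p.grad for p in model.parameters() if p.grad is not None]
    grad_norm = torch.norm(torch.stack([g.norm(p=2) for g in grads]))
    scale = rho / (grad_norm + 1e-12)

    perturbations = []
    with torch.no_grad():
        for p in model.parameters():
            e = p.grad * scale if p.grad is not None else None
            if e is not None:
                p.add_(e)  # climb to the sharpest nearby point
            perturbations.append(e)
    optimizer.zero_grad()

    # Second pass: gradient at the perturbed weights w + e(w).
    loss_fn(model(inputs), targets).backward()

    # Undo the perturbation, then descend with the sharpness-aware gradient.
    with torch.no_grad():
        for p, e in zip(model.parameters(), perturbations):
            if e is not None:
                p.sub_(e)
    optimizer.step()
    optimizer.zero_grad()
    return loss.item()
```

The two forward/backward passes roughly double the per-step cost, which is the usual trade-off SAM accepts in exchange for flatter minima and better generalization.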
Pages: 359-371
Page count: 12