Lightweight Network Traffic Classification Model Based on Knowledge Distillation

Cited: 4
|
Authors
Wu, Yanhui [1 ]
Zhang, Meng [1 ]
Affiliations
[1] Jilin Univ, Coll Comp Sci & Technol, Changchun 130012, Peoples R China
Source
WEB INFORMATION SYSTEMS ENGINEERING - WISE 2021, PT II | 2021 / Vol. 13081
Keywords
Network traffic classification; Knowledge distillation; Deep learning; Long short-term memory; NEURAL-NETWORKS;
DOI
10.1007/978-3-030-91560-5_8
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Deep learning has been extensively applied to network traffic classification. It reduces the need for manual feature design and achieves high accuracy in complex and highly variable networks. However, existing deep learning approaches usually require abundant storage and computing resources to reach that accuracy, and for on-line network traffic classification the latency and instability incurred by such costly models make them unsuitable. A promising tool for this challenge is knowledge distillation, which produces space- and time-efficient models from highly accurate large models. In this paper, we propose a lightweight encrypted traffic classification model based on knowledge distillation. We adopt an LSTM structure and apply knowledge distillation to it. The distillation loss is the focal loss, which effectively handles the imbalance in the number of samples and the varying difficulty of classification. To enhance learning efficiency, we design an adaptive temperature function that softens the labels at each training stage. Experiments show that, compared with the teacher model, the recognition speed of the student model increases by 72% while its accuracy decreases by only 0.45%, to 99.52%. Our model achieves both high accuracy and low latency for on-line encrypted traffic classification compared with the state of the art.
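The abstract combines three ingredients: temperature-softened teacher labels, a focal-style weighting of the distillation loss, and a temperature that adapts over training. The paper's exact formulations are not given here, so the following is only an illustrative numpy sketch of how these pieces typically fit together; the schedule in `adaptive_temperature`, the agreement-based focal weight, and all function names are assumptions for illustration, not the authors' definitions.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: larger T yields a softer distribution.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def adaptive_temperature(epoch, total_epochs, t_max=4.0, t_min=1.0):
    # Hypothetical schedule: start with very soft labels (high T) and
    # anneal linearly toward harder labels (low T) as training proceeds.
    return t_max - (t_max - t_min) * epoch / max(total_epochs - 1, 1)

def focal_distillation_loss(student_logits, teacher_logits, T, gamma=2.0):
    # Cross-entropy between softened teacher and student distributions,
    # modulated by a focal-style factor that down-weights samples on
    # which student and teacher already agree (an assumed proxy).
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    ce = -(p_t * np.log(p_s + 1e-12)).sum(axis=-1)  # per-sample CE
    agreement = (p_t * p_s).sum(axis=-1)            # high when distributions match
    focal_w = (1.0 - agreement) ** gamma            # focal modulation
    return (focal_w * ce).mean() * T * T            # T^2 rescaling, as in standard KD
```

In a training loop, `T = adaptive_temperature(epoch, total_epochs)` would be recomputed each epoch and passed to `focal_distillation_loss` alongside the LSTM student's and teacher's logits.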
Pages: 107-121
Page count: 15
Related papers
50 records
  • [1] Lightweight Road Traffic Sign Identification Neural Network Based on Knowledge Distillation
    Ge, Yiyuan
    Yu, Mingxin
    Computer Engineering and Applications, 2024, 60 (19) : 110 - 119
  • [2] Lightweight detection network for bridge defects based on model pruning and knowledge distillation
    Guan, Bin
    Li, Junjie
    STRUCTURES, 2024, 62
  • [3] A lightweight crack segmentation network based on knowledge distillation
    Wang, Wenjun
    Su, Chao
    Han, Guohui
    Zhang, Heng
    JOURNAL OF BUILDING ENGINEERING, 2023, 76
  • [4] Lightweight convolutional neural network with knowledge distillation for cervical cells classification
    Chen, Wen
    Gao, Liang
    Li, Xinyu
    Shen, Weiming
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2022, 71
  • [5] Lightweight remote sensing scene classification based on knowledge distillation
    Zhang, Chong-Yang
    Wang, Bin
    JOURNAL OF INFRARED AND MILLIMETER WAVES, 2024, 43 (05) : 684 - 695
  • [6] Spatial-temporal knowledge distillation for lightweight network traffic anomaly detection
    Wang, Xintong
    Wang, Zixuan
    Wang, Enliang
    Sun, Zhixin
    COMPUTERS & SECURITY, 2024, 137
  • [7] SEDG-Yolov5: A Lightweight Traffic Sign Detection Model Based on Knowledge Distillation
    Zhao, Liang
    Wei, Zhengjie
    Li, Yanting
    Jin, Junwei
    Li, Xuan
    ELECTRONICS, 2023, 12 (02)
  • [8] A Lightweight Malware Detection Model Based on Knowledge Distillation
    Miao, Chunyu
    Kou, Liang
    Zhang, Jilin
    Dong, Guozhong
    MATHEMATICS, 2024, 12 (24)
  • [9] Yarn state detection based on lightweight network and knowledge distillation
    Ren G.
    Tu J.
    Li Y.
    Qiu Z.
    Shi W.
Fangzhi Xuebao/Journal of Textile Research, 2023, 44 (09): 205 - 212
  • [10] A Lightweight Convolution Network with Self-Knowledge Distillation for Hyperspectral Image Classification
    Xu, Hao
    Cao, Guo
    Deng, Lindiao
    Ding, Lanwei
    Xu, Ling
    Pan, Qikun
    Shang, Yanfeng
    FOURTEENTH INTERNATIONAL CONFERENCE ON GRAPHICS AND IMAGE PROCESSING, ICGIP 2022, 2022, 12705