Training a thin and shallow lane detection network with self-knowledge distillation

Cited by: 2
Authors
Dai, Xuerui [1 ]
Yuan, Xue [1 ]
Wei, Xueye [1 ]
Institutions
[1] Beijing Jiaotong Univ, Sch Elect & Informat Engn, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
lane detection; deep learning; self-knowledge distillation; ROAD; VISION;
DOI
10.1117/1.JEI.30.1.013004
CLC Classification
TM [Electrical Technology]; TN [Electronics and Communication Technology];
Discipline Codes
0808; 0809;
Abstract
With the development of modern science and technology, vehicles are equipped with intelligent driver-assistance systems, of which lane detection is a key function. Complex detection structures (either wide or deep) have been investigated to boost accuracy and overcome the challenges of complicated scenarios. However, their computation and memory costs increase sharply, and so does their response time. For resource-constrained devices, lane detection networks with low cost and short inference time are needed. To obtain more accurate lane detection results, a large (deep and wide) detection structure is framed for high-dimensional, highly robust features, and a deep supervision loss is applied at different resolutions and stages. Despite its high-precision advantage, the large detection network cannot be deployed on embedded devices directly because of its memory and computation demands. To make the network thinner and lighter, a general training strategy, called self-knowledge distillation (SKD), is proposed. Unlike classical knowledge distillation, there are no independent teacher and student networks; the knowledge is distilled within the network itself. For a more comprehensive and precise evaluation, a new lane data set is collected; the Caltech Lane data set and TuSimple lane data set are also used. Experiments further prove that, via SKD, a small student network achieves detection accuracy similar to that of a large teacher network while having shorter inference time and lower memory usage. Thus it can be applied flexibly to resource-limited devices. © 2021 SPIE and IS&T [DOI: 10.1117/1.JEI.30.1.013004]
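The abstract describes the training strategy only at a high level. The following is a minimal PyTorch sketch of the general idea of self-knowledge distillation with deep supervision: a deeper exit of the same network acts as the teacher for a shallower exit, and both exits are also supervised by the ground truth. All module names, shapes, and loss weights (LaneNet, skd_loss, T, alpha) are illustrative assumptions, not the paper's actual architecture or hyperparameters.

```python
# Minimal SKD sketch: one network, two exits; the deeper exit's softened
# predictions teach the shallower exit (self-distillation), while a deep
# supervision loss trains both exits against the ground-truth lane mask.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LaneNet(nn.Module):
    """Toy backbone with a shallow ("student") and a deep ("teacher") exit."""
    def __init__(self):
        super().__init__()
        self.stage1 = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU())
        self.stage2 = nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU())
        self.shallow_head = nn.Conv2d(16, 2, 1)  # early exit, 2-class lane mask
        self.deep_head = nn.Conv2d(32, 2, 1)     # late exit, same resolution

    def forward(self, x):
        f1 = self.stage1(x)
        f2 = self.stage2(f1)
        return self.shallow_head(f1), self.deep_head(f2)

def skd_loss(shallow_logits, deep_logits, target, T=4.0, alpha=0.5):
    # Deep supervision: both exits are trained against the ground truth.
    ce = F.cross_entropy(shallow_logits, target) + F.cross_entropy(deep_logits, target)
    # Self-distillation: detach the deep exit so it serves as the teacher.
    kd = F.kl_div(
        F.log_softmax(shallow_logits / T, dim=1),
        F.softmax(deep_logits.detach() / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return ce + alpha * kd

model = LaneNet()
opt = torch.optim.SGD(model.parameters(), lr=1e-2)
x = torch.randn(2, 3, 64, 64)         # dummy road images
y = torch.randint(0, 2, (2, 64, 64))  # dummy per-pixel lane labels
shallow, deep = model(x)
loss = skd_loss(shallow, deep, y)
loss.backward()
opt.step()
```

After training, only the shallow exit would be kept for inference, which is how the student network obtains its shorter inference time and lower memory usage.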
Pages: 19
Related Papers (50 total)
  • [1] A Lightweight Approach for Network Intrusion Detection based on Self-Knowledge Distillation
    Yang, Shuo
    Zheng, Xinran
    Xu, Zhengzhuo
    Wang, Xingjun
    ICC 2023-IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023, : 3000 - 3005
  • [2] Neighbor self-knowledge distillation
    Liang, Peng
    Zhang, Weiwei
    Wang, Junhuang
    Guo, Yufeng
    INFORMATION SCIENCES, 2024, 654
  • [3] Adaptive lightweight network construction method for Self-Knowledge Distillation
    Lu, Siyuan
    Zeng, Weiliang
    Li, Xueshi
    Ou, Jiajun
    NEUROCOMPUTING, 2025, 624
  • [4] ROBUST AND ACCURATE OBJECT DETECTION VIA SELF-KNOWLEDGE DISTILLATION
    Xu, Weipeng
    Chu, Pengzhi
    Xie, Renhao
    Xiao, Xiongziyan
    Huang, Hongcheng
    2022 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, ICIP, 2022, : 91 - 95
  • [5] Self-knowledge distillation with dimensional history knowledge
    Huang, Wenke
    Ye, Mang
    Shi, Zekun
    Li, He
    Du, Bo
    SCIENCE CHINA INFORMATION SCIENCES, 2025, 68 (9)
  • [6] A Knowledge Distillation Network Combining Adversarial Training and Intermediate Feature Extraction for Lane Line Detection
    Zhu, Fenghua
    Chen, Yuanyuan
    2024 AUSTRALIAN & NEW ZEALAND CONTROL CONFERENCE, ANZCC, 2024, : 92 - 97
  • [7] Self-knowledge distillation via dropout
    Lee, Hyoje
    Park, Yeachan
    Seo, Hyun
    Kang, Myungjoo
    COMPUTER VISION AND IMAGE UNDERSTANDING, 2023, 233
  • [8] Dual teachers for self-knowledge distillation
    Li, Zheng
    Li, Xiang
    Yang, Lingfeng
    Song, Renjie
    Yang, Jian
    Pan, Zhigeng
    PATTERN RECOGNITION, 2024, 151
  • [9] A Lightweight Convolution Network with Self-Knowledge Distillation for Hyperspectral Image Classification
    Xu, Hao
    Cao, Guo
    Deng, Lindiao
    Ding, Lanwei
    Xu, Ling
    Pan, Qikun
    Shang, Yanfeng
    FOURTEENTH INTERNATIONAL CONFERENCE ON GRAPHICS AND IMAGE PROCESSING, ICGIP 2022, 2022, 12705
  • [10] Sliding Cross Entropy for Self-Knowledge Distillation
    Lee, Hanbeen
    Kim, Jeongho
    Woo, Simon S.
    PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2022, 2022, : 1044 - 1053