A-DARTS: attention-guided differentiable architecture search for lung nodule classification

Cited by: 5
Authors
Hu, Liangxiao [1 ]
Liu, Qinglin [1 ]
Zhang, Jun [2 ]
Jiang, Feng [1 ,3 ]
Liu, Yang [1 ]
Zhang, Shengping [1 ,3 ]
Affiliations
[1] Harbin Inst Technol, Sch Comp Sci & Technol, Harbin, Peoples R China
[2] Hefei Univ Technol, Sch Comp Sci & Informat Engn, Hefei, Peoples R China
[3] Peng Cheng Lab, Shenzhen, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
differentiable architecture search; lung nodule classification; attention mechanism; NEURAL-NETWORK;
D O I
10.1117/1.JEI.30.1.013012
Chinese Library Classification
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline Codes
0808 ; 0809 ;
Abstract
Lung cancer has caused the most cancer deaths in the past several years. Benign/malignant lung nodule classification is vital in lung nodule detection, as it can aid the early diagnosis of lung cancer. Most existing works extract features of chest CT images using well-designed networks, which require substantial expert effort. To automate the manual process of network design, we propose an attention-guided differentiable architecture search (A-DARTS) method, which directly searches for the optimal network on chest CT images. In addition, A-DARTS utilizes an attention mechanism to alleviate the effect of the initialization-sensitive nature of the searched network while enhancing the feature representation ability. Extensive experiments on the Lung Image Database Consortium image collection (LIDC-IDRI) benchmark dataset show that the proposed method achieves a lung nodule classification accuracy of 92.93%, which is superior to the state-of-the-art methods. (C) 2021 SPIE and IS&T [DOI: 10.1117/1.JEI.30.1.013012]
Pages: 11
Related Papers
41 in total
[31]   An improved CNN-based architecture for automatic lung nodule classification [J].
Mahmood, Sozan Abdullah ;
Ahmed, Hunar Abubakir .
MEDICAL & BIOLOGICAL ENGINEERING & COMPUTING, 2022, 60 (07) :1977-1986
[32]   Adaptive weighted aggregation in Group Improvised Harmony Search for lung nodule classification [J].
Kar, Subhajit ;
Das Sharma, Kaushik ;
Maitra, Madhubanti .
JOURNAL OF EXPERIMENTAL & THEORETICAL ARTIFICIAL INTELLIGENCE, 2020, 32 (02) :219-242
[33]   Neural architecture search combined with efficient attention for hyperspectral image classification [J].
Chen, Haisong ;
Zhang, Kang ;
Lu, Haoran ;
Wang, Aili ;
Wu, Haibin .
CHINESE JOURNAL OF LIQUID CRYSTALS AND DISPLAYS, 2025, 40 (04) :630-641
[34]   Attention-guided deep framework for polyp localization and subsequent classification via polyp local and Siamese feature fusion [J].
Sasmal, Pradipta ;
Panigrahi, Susant Kumar ;
Panda, Swarna Laxmi ;
Bhuyan, M. K. .
MEDICAL & BIOLOGICAL ENGINEERING & COMPUTING, 2025,
[35]   ARM-Net: Attention-guided residual multiscale CNN for multiclass brain tumor classification using MR images [J].
Dutta, Tapas Kumar ;
Nayak, Deepak Ranjan ;
Zhang, Yu-Dong .
BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2024, 87
[36]   AMFFNet: attention-guided multi-level feature fusion network for land cover classification of remote sensing images [J].
Tang, Bochuan ;
Tuerxun, Palidan ;
Qi, Ranran ;
Yang, Guangqi ;
Qian, Yurong .
JOURNAL OF APPLIED REMOTE SENSING, 2023, 17 (02)
[37]   An Improved Skin Lesion Classification Using a Hybrid Approach with Active Contour Snake Model and Lightweight Attention-Guided Capsule Networks [J].
Behara, Kavita ;
Bhero, Ernest ;
Agee, John Terhile .
DIAGNOSTICS, 2024, 14 (06)
[38]   Dense image-mask attention-guided transformer network for jaw lesions classification and segmentation in dental cone-beam computed tomography images [J].
Li, Xiang ;
Liu, Wei ;
Tang, Wei ;
Guo, Jixiang .
APPLIED INTELLIGENCE, 2025, 55 (06)
[39]   EL-NAS: Efficient Lightweight Attention Cross-Domain Architecture Search for Hyperspectral Image Classification [J].
Wang, Jianing ;
Hu, Jinyu ;
Liu, Yichen ;
Hua, Zheng ;
Hao, Shengjia ;
Yao, Yuqiong .
REMOTE SENSING, 2023, 15 (19)
[40]   MSA-Net: multiple self-attention mechanism for 3D lung nodule classification in CT images [J].
Pan, Jiating ;
Liang, Lishi ;
Sun, Peng ;
Liang, Yongbo ;
Zhu, Jianming ;
Chen, Zhencheng .
BMC MEDICAL IMAGING, 2025, 25 (01)