CephaNN: A Multi-Head Attention Network for Cephalometric Landmark Detection

Cited by: 27
Authors
Qian, Jiahong [1]
Luo, Weizhi [1]
Cheng, Ming [1]
Tao, Yubo [1,2]
Lin, Jun [3]
Lin, Hai [1,2]
Affiliations
[1] Zhejiang Univ, State Key Lab CAD&CG, Hangzhou 310058, Peoples R China
[2] Zhejiang Univ, Innovat Ctr Minimally Invas Tech & Device, Hangzhou 310058, Peoples R China
[3] Zhejiang Univ, Coll Med, Affiliated Hosp 1, Dept Stomatol, Hangzhou 310058, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Heating systems; Neural networks; Kernel; Feature extraction; Annotations; Two dimensional displays; Deep learning; Cephalometric landmark detection; multi-head attention; neural network; intermediate supervision; region enhance;
DOI
10.1109/ACCESS.2020.3002939
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Cephalometric landmark detection is a crucial step in orthodontic and orthognathic treatments. To detect cephalometric landmarks accurately, we propose a novel multi-head attention neural network (CephaNN). CephaNN is an end-to-end network based on heatmaps of the annotated landmarks, and it consists of two parts: the multi-head part and the attention part. In the multi-head part, we adopt multi-head subnets to gain comprehensive knowledge of various subspaces of a cephalogram, and intermediate supervision is applied to accelerate convergence. Based on the feature maps learned by the multi-head part, the attention part applies the multi-attention mechanism to obtain a refined detection. To address the class imbalance problem, we propose a region enhancing (RE) loss that emphasizes the effective regions of the regressed heatmaps. Experiments on the benchmark dataset demonstrate that CephaNN achieves state-of-the-art performance, with a detection accuracy of 87.61% within the clinically accepted 2.0-mm range. Furthermore, CephaNN is effective in classifying anatomical types and robust in a real-world application on a 75-landmark dataset.
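Since the abstract is the record's only technical passage, a rough illustration of the ideas it names may help. Below is a minimal, hypothetical PyTorch sketch of the two-part design and an RE-style loss. Every name here (CephaNNSketch, region_enhancing_loss), the reduction of the attention part to a 1x1-convolution fusion, and the fg_weight/threshold parameters are assumptions for illustration, not the paper's implementation; the default of 19 landmarks matches the public cephalometric benchmark (ISBI 2015) commonly used in this literature.

import torch
import torch.nn as nn

class CephaNNSketch(nn.Module):
    """Hypothetical, heavily simplified sketch of the two-part design:
    several head subnets regress intermediate heatmaps (used for
    intermediate supervision), and an attention part fuses them into
    refined heatmaps. The real attention mechanism is reduced here to a
    1x1-convolution fusion for brevity."""

    def __init__(self, num_landmarks=19, num_heads=4, channels=64):
        super().__init__()
        # Multi-head part: each subnet views the cephalogram in its own subspace.
        self.heads = nn.ModuleList(
            nn.Sequential(
                nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(),
                nn.Conv2d(channels, num_landmarks, 1),
            )
            for _ in range(num_heads)
        )
        # Attention part (stand-in): refine the stacked per-head heatmaps.
        self.refine = nn.Sequential(
            nn.Conv2d(num_heads * num_landmarks, channels, 1), nn.ReLU(),
            nn.Conv2d(channels, num_landmarks, 1),
        )

    def forward(self, x):
        # Intermediate heatmaps from each head; supervising these is the
        # "intermediate supervision" that accelerates convergence.
        inter = [head(x) for head in self.heads]
        refined = self.refine(torch.cat(inter, dim=1))
        return inter, refined

def region_enhancing_loss(pred, target, fg_weight=10.0, threshold=0.1):
    """Hypothetical RE-style loss: up-weight pixels near annotated
    landmarks (where the ground-truth heatmap exceeds a small threshold)
    to counter the foreground/background class imbalance. The paper's
    exact formulation may differ."""
    sq_err = (pred - target) ** 2
    weights = torch.where(target > threshold,
                          torch.full_like(target, fg_weight),
                          torch.ones_like(target))
    return (weights * sq_err).mean()

A training step would apply region_enhancing_loss to each intermediate heatmap as well as to the refined output and sum the terms, so the head subnets converge quickly while the attention part is trained on the final objective.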
Pages: 112633-112641
Number of pages: 9