Acral melanocytic lesion segmentation with a convolution neural network (U-Net)

Cited by: 2
Authors
Jaworek-Korjakowska, Joanna [1 ]
Affiliations
[1] AGH Univ Sci & Technol, Dept Automat Control & Robot, Al A Mickiewicza 30, PL-30059 Krakow, Poland
Source
MEDICAL IMAGING 2019: COMPUTER-AIDED DIAGNOSIS | 2019 / Vol. 10950
Keywords
Acral melanoma; deep learning; U-Net architecture; skin cancer; segmentation; DERMOSCOPY IMAGES; CLASSIFICATION;
DOI
10.1117/12.2512804
Chinese Library Classification (CLC)
R318 [Biomedical Engineering];
Subject Classification Code
0831;
Abstract
Melanocytic lesions of acral sites (ALM) are common, with an estimated prevalence of 28-36% in the USA. While the majority of these lesions are benign, differentiating them from acral melanoma (AM) is often challenging. Much research has been done on segmenting and classifying skin moles located in acral volar areas; however, methods published to date cannot be easily extended to other skin regions because of differences in appearance and properties. In this paper, we propose a deep learning (U-Net) architecture to segment acral melanocytic lesions, which is a necessary initial step for skin lesion pattern recognition and a prerequisite for accurate classification and diagnosis. The U-Net is one of the most promising deep learning solutions for image segmentation and is built upon a fully convolutional network. On an independent validation dataset of 210 dermoscopy images, our implementation achieved high segmentation accuracy: an average DSC of 0.92, accuracy of 0.94, sensitivity of 0.91, and specificity of 0.92. Because of their small size and similarity to other local structures, ALM pose considerable difficulties during segmentation and assessment. Advanced segmentation methods such as deep learning models, and convolutional neural networks in particular, have the potential to improve segmentation accuracy in this demanding medical imaging domain.
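As an illustration of the approach described in the abstract, the sketch below shows a minimal U-Net-style encoder-decoder together with the Dice similarity coefficient (DSC) used to score predicted lesion masks. This is a hedged sketch, not the paper's implementation: the framework (PyTorch), the network depth, and the layer widths are assumptions chosen for brevity.

```python
# Minimal U-Net-style segmentation sketch (illustrative only; depth, channel
# widths, and framework are assumptions, not the paper's exact configuration).
import torch
import torch.nn as nn


def double_conv(in_ch, out_ch):
    # Two 3x3 conv + ReLU layers: the basic U-Net building block.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )


class MiniUNet(nn.Module):
    def __init__(self, in_ch=3, out_ch=1):
        super().__init__()
        self.enc1 = double_conv(in_ch, 32)
        self.enc2 = double_conv(32, 64)
        self.bottleneck = double_conv(64, 128)
        self.pool = nn.MaxPool2d(2)
        self.up2 = nn.ConvTranspose2d(128, 64, 2, stride=2)
        self.dec2 = double_conv(128, 64)      # 64 (skip) + 64 (upsampled)
        self.up1 = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = double_conv(64, 32)       # 32 (skip) + 32 (upsampled)
        self.head = nn.Conv2d(32, out_ch, 1)  # 1x1 conv -> per-pixel lesion logits

    def forward(self, x):
        e1 = self.enc1(x)                      # full resolution
        e2 = self.enc2(self.pool(e1))          # 1/2 resolution
        b = self.bottleneck(self.pool(e2))     # 1/4 resolution
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))   # skip connection
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))  # skip connection
        return self.head(d1)


def dice_coefficient(pred_mask, true_mask, eps=1e-7):
    # DSC = 2|A ∩ B| / (|A| + |B|) on binary masks; the paper reports 0.92 on average.
    pred = pred_mask.float().flatten()
    true = true_mask.float().flatten()
    intersection = (pred * true).sum()
    return (2.0 * intersection + eps) / (pred.sum() + true.sum() + eps)


if __name__ == "__main__":
    model = MiniUNet()
    x = torch.randn(1, 3, 128, 128)            # a dermoscopy image crop
    logits = model(x)
    pred = torch.sigmoid(logits) > 0.5
    print(logits.shape, dice_coefficient(pred, pred))  # perfect overlap -> DSC 1.0
```

The skip connections (concatenating encoder features into the decoder) are the defining element of the U-Net design; the DSC used for evaluation measures overlap between predicted and ground-truth masks, which is well suited to small lesions such as ALM.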
Pages: 7
Related Papers (50 total)
  • [1] Alahmadi, Mohammad D. Multiscale Attention U-Net for Skin Lesion Segmentation. IEEE ACCESS, 2022, 10: 59145-59154.
  • [2] Anand, Vatsala; Gupta, Sheifali; Koundal, Deepika; Nayak, Soumya Ranjan; Barsocchi, Paolo; Bhoi, Akash Kumar. Modified U-NET Architecture for Segmentation of Skin Lesion. SENSORS, 2022, 22 (03).
  • [3] Abid, Iqra; Almakdi, Sultan; Rahman, Hameedur; Almulihi, Ahmed; Alqahtani, Ali; Rajab, Khairan; Alqhatani, Abdulmajeed; Shaikh, Asadullah. A Convolutional Neural Network for Skin Lesion Segmentation Using Double U-Net Architecture. INTELLIGENT AUTOMATION AND SOFT COMPUTING, 2022, 33 (03): 1407-1421.
  • [4] Anand, Vatsala; Gupta, Sheifali; Koundal, Deepika; Singh, Karamjeet. Fusion of U-Net and CNN model for segmentation and classification of skin lesion from dermoscopy images. EXPERT SYSTEMS WITH APPLICATIONS, 2023, 213.
  • [5] Kirichev, Mark; Slavov, Todor; Momcheva, Galina. Fuzzy U-Net Neural Network Design for Image Segmentation. CONTEMPORARY METHODS IN BIOINFORMATICS AND BIOMEDICINE AND THEIR APPLICATIONS, 2022, 374: 177-184.
  • [6] Khaled, Roa'a; Vidal, Joel; Vilanova, Joan C.; Marti, Robert. A U-Net Ensemble for breast lesion segmentation in DCE MRI. COMPUTERS IN BIOLOGY AND MEDICINE, 2022, 140.
  • [7] Nampalle, Kishore Babu; Pundhir, Anshul; Jupudi, Pushpamanjari Ramesh; Raman, Balasubramanian. Towards improved U-Net for efficient skin lesion segmentation. MULTIMEDIA TOOLS AND APPLICATIONS, 2024, 83 (28): 71665-71682.
  • [8] Sundar, Sumod; Sumathy, S. RetU-Net: An Enhanced U-Net Architecture for Retinal Lesion Segmentation. INTERNATIONAL JOURNAL ON ARTIFICIAL INTELLIGENCE TOOLS, 2023, 32 (04).
  • [9] Santone, Antonella; De Vivo, Rosamaria; Recchia, Laura; Cesarelli, Mario; Mercaldo, Francesco. A Method for Retina Segmentation by Means of U-Net Network. ELECTRONICS, 2024, 13 (22).
  • [10] Santone, Antonella; Cesarelli, Mario; Mercaldo, Francesco. A Method for Polyp Segmentation Through U-Net Network. BIOENGINEERING-BASEL, 2025, 12 (03).