Deep Learning and Handcrafted Method Fusion: Higher Diagnostic Accuracy for Melanoma Dermoscopy Images

Cited by: 128
Authors
Hagerty, Jason R. [1 ]
Stanley, R. Joe [2 ]
Almubarak, Haidar A. [2 ]
Lama, Norsang [2 ]
Kasmi, Reda [3 ]
Guo, Peng [2 ]
Drugge, Rhett J. [4 ]
Rabinovitz, Harold S. [5 ]
Oliviero, Margaret [5 ]
Stoecker, William V. [1 ]
Affiliations
[1] S&A Technol, Rolla, MO 65401 USA
[2] Missouri Univ Sci & Technol, Rolla, MO 65209 USA
[3] Univ Bejaia, Bejaia 06000, Algeria
[4] Sheard & Drugge, Stamford, CT 06902 USA
[5] Plantation Skin & Canc Associates, Plantation, FL 33324 USA
Funding
U.S. National Institutes of Health;
Keywords
Melanoma; dermoscopy; deep learning; classifier; transfer learning; LESION SEGMENTATION; SKIN-LESIONS; CLASSIFICATION; ALGORITHMS; CHALLENGE; TEXTURE; AREAS;
DOI
10.1109/JBHI.2019.2891049
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
This paper presents an approach that combines conventional image processing with deep learning by fusing the features from the individual techniques. We hypothesize that the two techniques, with different error profiles, are synergistic. The conventional image processing arm uses three handcrafted, biologically inspired image processing modules and one clinical information module. The image processing modules detect lesion features comparable to clinical dermoscopy information: atypical pigment network, color distribution, and blood vessels. The clinical module includes information submitted to the pathologist: patient age, gender, lesion location, size, and patient history. The deep learning arm uses knowledge transfer via a ResNet-50 network that is repurposed to predict the probability of melanoma. The classification scores of the individual modules from both processing arms are then ensembled using logistic regression to predict an overall melanoma probability. Using cross-validated melanoma classification results measured by area under the receiver operating characteristic curve (AUC), the fusion technique achieved an AUC of 0.94. In comparison, the ResNet-50 deep-learning classifier alone yields an AUC of 0.87 and the conventional image-processing classifier yields an AUC of 0.90. Further study of the fusion of conventional image processing techniques and deep learning is warranted.
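The fusion step described in the abstract is a score-level ensemble: each handcrafted module and the ResNet-50 arm produces a per-lesion melanoma score, and logistic regression maps those scores to a single melanoma probability that is evaluated by cross-validated AUC. Below is a minimal sketch of that ensemble, assuming scikit-learn; the module count, array shapes, and synthetic data are illustrative assumptions, not the authors' pipeline.

```python
# Sketch of score-level fusion via logistic regression, evaluated by
# cross-validated AUC. All data here are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_predict
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_lesions = 200

# Hypothetical per-module scores in [0, 1]: e.g., atypical pigment network,
# color distribution, blood vessels, clinical information, and the ResNet-50
# arm's melanoma probability (five columns, one per module).
module_scores = rng.uniform(0.0, 1.0, size=(n_lesions, 5))
labels = rng.integers(0, 2, size=n_lesions)   # 1 = melanoma, 0 = benign

# Logistic regression ensembles the module scores into one overall probability.
fusion = LogisticRegression(max_iter=1000)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
fused_prob = cross_val_predict(fusion, module_scores, labels,
                               cv=cv, method="predict_proba")[:, 1]

print("cross-validated fusion AUC:", roc_auc_score(labels, fused_prob))
```

In the paper, the ensemble inputs would be the real outputs of the handcrafted and deep-learning modules; the random placeholders above (which give an AUC near 0.5) only make the sketch self-contained and runnable.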
Pages: 1385-1391
Page count: 7
Related Papers
50 records in total
  • [31] Melanoma detection from dermoscopy images using Nasnet Mobile with Transfer Learning
    Cakmak, Mustafa
    Tenekeci, Mehmet Emin
    29TH IEEE CONFERENCE ON SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS (SIU 2021), 2021,
  • [33] Deep metric attention learning for skin lesion classification in dermoscopy images
    He, Xiaoyu
    Wang, Yong
    Zhao, Shuang
    Yao, Chunli
    COMPLEX & INTELLIGENT SYSTEMS, 2022, 8 (02) : 1487 - 1504
  • [34] Skin Lesion Classification in Dermoscopy Images Using Synergic Deep Learning
    Zhang, Jianpeng
    Xie, Yutong
    Wu, Qi
    Xia, Yong
    MEDICAL IMAGE COMPUTING AND COMPUTER ASSISTED INTERVENTION - MICCAI 2018, PT II, 2018, 11071 : 12 - 20
  • [35] Deep Learning Method for Melanoma Discrimination Using Blood Flow Distribution Images
    Akiguchi, Shunsuke
    Kyoden, Tomoaki
    Tajiri, Tomoki
    Andoh, Tsugunobu
    Hachiga, Tadashi
    IEEJ TRANSACTIONS ON ELECTRICAL AND ELECTRONIC ENGINEERING, 2021, 16 (05) : 813 - 815
  • [37] Development of a deep learning method for improving diagnostic accuracy for uterine sarcoma cases
    Toyohara, Yusuke
    Sone, Kenbun
    Noda, Katsuhiko
    Yoshida, Kaname
    Kurokawa, Ryo
    Tanishima, Tomoya
    Kato, Shimpei
    Inui, Shohei
    Nakai, Yudai
    Ishida, Masanori
    Gonoi, Wataru
    Tanimoto, Saki
    Takahashi, Yu
    Inoue, Futaba
    Kukita, Asako
    Kawata, Yoshiko
    Taguchi, Ayumi
    Furusawa, Akiko
    Miyamoto, Yuichiro
    Tsukazaki, Takehiro
    Tanikawa, Michihiro
    Iriyama, Takayuki
    Mori-Uchino, Mayuyo
    Tsuruga, Tetsushi
    Oda, Katsutoshi
    Yasugi, Toshiharu
    Takechi, Kimihiro
    Abe, Osamu
    Osuga, Yutaka
    SCIENTIFIC REPORTS, 2022, 12 (01)
  • [38] Improving mammography lesion classification by optimal fusion of handcrafted and deep transfer learning features
    Jones, Meredith A.
    Faiz, Rowzat
    Qiu, Yuchen
    Zheng, Bin
    PHYSICS IN MEDICINE AND BIOLOGY, 2022, 67 (05)
  • [39] Automatic Colorization of Greyscale Images Using Deep Learning Fusion Technique and Accuracy Level Analysis
    Akarawita, Isurie
    Abeykoon, A. M. Harsha S.
    2024 INTERNATIONAL CONFERENCE ON ADVANCED ROBOTICS AND MECHATRONICS, ICARM 2024, 2024, : 625 - 630
  • [40] Epileptic Seizures Detection in EEG Signals Using Fusion Handcrafted and Deep Learning Features
    Malekzadeh, Anis
    Zare, Assef
    Yaghoobi, Mahdi
    Kobravi, Hamid-Reza
    Alizadehsani, Roohallah
    SENSORS, 2021, 21 (22)