StackNet-DenVIS: a multi-layer perceptron stacked ensembling approach for COVID-19 detection using X-ray images

Cited by: 21
Authors
Autee, Pratik [1 ]
Bagwe, Sagar [1 ]
Shah, Vimal [1 ,2 ]
Srivastava, Kriti [1 ]
Affiliations
[1] Dwarkadas J Sanghvi Coll Engn, Dept Comp Engn, Mumbai, Maharashtra, India
[2] A-602 Venkatesh Pooja,Balaji Complex,150 Feet Rd, Thana 401101, Maharashtra, India
Keywords
Covid-19; Stacked generalization; Transfer learning; Deep neural networks; Generative adversarial networks; Image segmentation; Lung segmentation
DOI
10.1007/s13246-020-00952-6
Chinese Library Classification
R318 [Biomedical Engineering]
Discipline code
0831
Abstract
The highly contagious nature of Coronavirus disease 2019 (Covid-19) resulted in a global pandemic. Because conventional testing for Covid-19 is relatively slow and taxing, a faster method needs to be in place. Current research suggests that visible irregularities found in the chest X-rays of Covid-19-positive patients are indicative of the presence of the disease. Hence, deep learning and image classification techniques can be employed to learn from these irregularities and classify cases with high accuracy. This research presents a classifier model named StackNet-DenVIS, designed to act as a screening step before the existing swab tests are conducted. Using a novel approach that incorporates transfer learning and stacked generalization, the model aims to lower the false negative rate of classification, compensating for the 30% false negative rate of the swab tests. A dataset of 9953 chest X-rays (868 Covid and 9085 non-Covid) gathered from multiple reliable sources was used. This research also demonstrates handling data imbalance using various techniques involving generative adversarial networks and sampling. The accuracy, sensitivity, and specificity obtained with the proposed model were 95.07%, 99.40%, and 94.61%, respectively. To the best of our knowledge, the combination of accuracy and false negative rate obtained in this paper outperforms current implementations. We must also highlight that the proposed architecture considers other types of viral pneumonia. Given the unprecedented sensitivity of our model, we are optimistic that it contributes to better Covid-19 detection.
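The paper's exact base networks and meta-learner configuration are not given in this record, but the stacked-generalization idea the abstract describes (several base classifiers whose out-of-fold predictions feed a multi-layer perceptron meta-learner) can be sketched as follows. The base learners and synthetic data here are illustrative placeholders, not the CNN feature extractors StackNet-DenVIS actually uses:

```python
# Minimal sketch of stacked generalization with an MLP meta-learner.
# Hypothetical stand-ins: logistic regression and an SVM play the role
# of the paper's deep base models; synthetic vectors replace X-ray features.
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Imbalanced binary data, loosely mirroring the Covid/non-Covid skew.
X, y = make_classification(n_samples=1000, n_features=20,
                           weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Base learners produce cross-validated (out-of-fold) predictions,
# which the MLP meta-classifier then learns to combine.
stack = StackingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)),
                ("svm", SVC(probability=True, random_state=0))],
    final_estimator=MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000,
                                  random_state=0),
    cv=5,
)
stack.fit(X_tr, y_tr)
print(f"held-out accuracy: {stack.score(X_te, y_te):.2f}")
```

Training the meta-learner on out-of-fold predictions (the `cv=5` argument) is what distinguishes stacking from simply averaging base-model outputs: it prevents the meta-learner from overfitting to base models' training-set confidence.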
Pages: 1399-1414 (16 pages)