Active contour model based on local Kullback-Leibler divergence for fast image segmentation

Cited: 25
Authors
Yang, Chengxin [1 ]
Weng, Guirong [1 ]
Chen, Yiyang [1 ]
Affiliations
[1] Soochow Univ, Sch Mech & Elect Engn, 178 Ganjiang Rd, Suzhou 215021, Jiangsu, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Image segmentation; Kullback-Leibler divergence; Level set method; Inhomogeneous intensity; Robustness; LEVEL SET EVOLUTION; DRIVEN; ENERGY;
DOI
10.1016/j.engappai.2023.106472
CLC number
TP [Automation Technology, Computer Technology];
Discipline code
0812;
Abstract
Intensity inhomogeneity and noise are the main factors that degrade segmentation results. To overcome these challenges, a new active contour model is designed based on the level set method and the Kullback-Leibler divergence. First, a new regional measure of information scale, rather than the Euclidean distance, is used to construct the energy functional; test results demonstrate that the Kullback-Leibler divergence yields clearly better segmentation. Second, a new Heaviside function is proposed that has a steeper slope at the zero crossing than the traditional function, so it drives the evolution of the level set function faster and allocates the internal and external energy more reasonably. In addition, the activation function is improved so that it fluctuates over a smaller range than the former one. Experiments show that the 'Local Kullback-Leibler Divergence' (LKLD) model produces the desired segmentation results on both real-world and medical images; it also exhibits better noise robustness and is not sensitive to the position of the initial contour.
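The abstract describes replacing the squared Euclidean distance of classic local region-based models with a Kullback-Leibler measure between locally fitted intensity distributions. The paper's exact energy, Heaviside function, and activation function are not given here, so the Python sketch below only illustrates the general idea under explicit assumptions: locally fitted Gaussian region models obtained by Gaussian smoothing, the closed-form KL divergence between univariate Gaussians as the region dissimilarity, and a standard arctangent-smoothed Heaviside as a stand-in for the paper's steeper variant.

```python
# Hedged sketch of a local-KL region measure for a level-set active contour.
# Assumptions (not from the paper): Gaussian local region models, closed-form
# univariate-Gaussian KL as dissimilarity, arctan-smoothed Heaviside.
import numpy as np
from scipy.ndimage import gaussian_filter

def smoothed_heaviside(phi, eps=1.0):
    # Standard smoothed Heaviside H_eps(phi); the paper proposes a variant
    # with a steeper slope at the zero crossing (its form is not in the abstract).
    return 0.5 * (1.0 + (2.0 / np.pi) * np.arctan(phi / eps))

def local_gaussian_stats(image, weight, sigma=3.0, tiny=1e-6):
    # Locally weighted mean/variance via Gaussian smoothing; `weight` is
    # H(phi) for the inside region or 1 - H(phi) for the outside region.
    w = gaussian_filter(weight, sigma) + tiny
    mean = gaussian_filter(weight * image, sigma) / w
    var = gaussian_filter(weight * image ** 2, sigma) / w - mean ** 2
    return mean, np.maximum(var, tiny)

def kl_gaussian(mu1, var1, mu2, var2):
    # KL( N(mu1, var1) || N(mu2, var2) ), closed form; used here in place of
    # the squared Euclidean distance (I - mean)^2 as the region measure.
    return 0.5 * (np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

# Toy example: noisy square, circular initial contour.
rng = np.random.default_rng(0)
img = np.zeros((64, 64)); img[20:44, 20:44] = 1.0
img += 0.1 * rng.standard_normal(img.shape)
yy, xx = np.mgrid[:64, :64]
phi = 15.0 - np.sqrt((xx - 32.0) ** 2 + (yy - 32.0) ** 2)  # > 0 inside

H = smoothed_heaviside(phi)
mu_in, var_in = local_gaussian_stats(img, H)
mu_out, var_out = local_gaussian_stats(img, 1.0 - H)

# One plausible data force: model each pixel as a narrow Gaussian around its
# intensity and compare it to the two local region models; the sign decides
# whether the pixel is better explained by the inside or the outside model.
sigma0 = 1e-2
d_in = kl_gaussian(img, sigma0, mu_in, var_in)
d_out = kl_gaussian(img, sigma0, mu_out, var_out)
force = d_out - d_in  # > 0 where the inside model fits better
print(force.shape, float(force[32, 32]), float(force[5, 5]))
```

In a full curve-evolution loop this force would be multiplied by a smoothed Dirac delta of phi and combined with regularization terms before updating the level set; those steps follow the usual level-set recipe rather than anything specific to this paper.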
Pages: 16
Related papers
50 records in total
  • [21] Edge Detection Method of Binary Image Based on Kullback-Leibler Divergence
    Li, Jianjun
    Wei, Zhihui
    Zhang, Zhengjun
    PROCEEDINGS OF 2008 INTERNATIONAL PRE-OLYMPIC CONGRESS ON COMPUTER SCIENCE, VOL II: INFORMATION SCIENCE AND ENGINEERING, 2008, : 466 - 468
  • [22] Tomographic Image Reconstruction Based on Minimization of Symmetrized Kullback-Leibler Divergence
    Kasai, Ryosuke
    Yamaguchi, Yusaku
    Kojima, Takeshi
    Yoshinaga, Tetsuya
    MATHEMATICAL PROBLEMS IN ENGINEERING, 2018, 2018
  • [23] Segmentation of SAR image based on Kullback-Leibler distance and regular tessellation
    Zhao, Q.-H.
    Gao, J.
    Zhao, X.-M.
    Li, Y.
    Kongzhi yu Juece/Control and Decision, 2018, 33 (10): 1767 - 1774
  • [24] Kullback-Leibler divergence for Bayesian nonparametric model checking
    Al-Labadi, Luai
    Patel, Viskakh
    Vakiloroayaei, Kasra
    Wan, Clement
    JOURNAL OF THE KOREAN STATISTICAL SOCIETY, 2021, 50 (01) : 272 - 289
  • [25] Kullback-Leibler Divergence Metric Learning
    Ji, Shuyi
    Zhang, Zizhao
    Ying, Shihui
    Wang, Liejun
    Zhao, Xibin
    Gao, Yue
    IEEE TRANSACTIONS ON CYBERNETICS, 2022, 52 (04) : 2047 - 2058
  • [26] Nonparametric Estimation of Kullback-Leibler Divergence
    Zhang, Zhiyi
    Grabchak, Michael
    NEURAL COMPUTATION, 2014, 26 (11) : 2570 - 2593
  • [27] Model parameter learning using Kullback-Leibler divergence
    Lin, Chungwei
    Marks, Tim K.
    Pajovic, Milutin
    Watanabe, Shinji
    Tung, Chih-kuan
    PHYSICA A-STATISTICAL MECHANICS AND ITS APPLICATIONS, 2018, 491 : 549 - 559
  • [28] Use of Kullback-Leibler divergence for forgetting
    Karny, Miroslav
    Andrysek, Josef
    INTERNATIONAL JOURNAL OF ADAPTIVE CONTROL AND SIGNAL PROCESSING, 2009, 23 (10) : 961 - 975
  • [29] Kullback-Leibler divergence-based Improved Particle Filter
    Mansouri, Majdi
    Nounou, Hazem
    Nounou, Mohamed
    2014 11TH INTERNATIONAL MULTI-CONFERENCE ON SYSTEMS, SIGNALS & DEVICES (SSD), 2014,
  • [30] Source Resolvability with Kullback-Leibler Divergence
    Nomura, Ryo
    2018 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2018, : 2042 - 2046