Flexibly regularized mixture models and application to image segmentation

Cited by: 9
Authors
Vacher, Jonathan [1 ,4 ]
Launay, Claire [1 ]
Coen-Cagli, Ruben [1 ,2 ,3 ]
Affiliations
[1] Albert Einstein Coll Med, Dept Syst & Computat Biol, 1300 Morris Pk Ave, Bronx, NY 10461 USA
[2] Albert Einstein Coll Med, Dominick P Purpura Dept Neurosci, 1300 Morris Pk Ave, Bronx, NY 10461 USA
[3] Albert Einstein Coll Med, Dept Ophthalmol & Visual Sci, 1300 Morris Pk Ave, Bronx, NY 10461 USA
[4] PSL Univ, Ecole Normale Super, Dept Etud Cognit, Lab Syst Perceptif, 24 Rue Lhomond,Batiment Jaures,2Eme Etage, F-75005 Paris, France
Keywords
Unsupervised learning; Mixture models; Graphical model; Factor graph; Image segmentation; Convolutional neural networks; Convergence
DOI
10.1016/j.neunet.2022.02.010
Chinese Library Classification
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Probabilistic finite mixture models are widely used for unsupervised clustering. These models can often be improved by adapting them to the topology of the data. For instance, in order to classify spatially adjacent data points similarly, it is common to introduce a Laplacian constraint on the posterior probability that each data point belongs to a class. Alternatively, the mixing probabilities can be treated as free parameters, while assuming Gauss-Markov or more complex priors to regularize those mixing probabilities. However, these approaches are constrained by the shape of the prior and often lead to complicated or intractable inference. Here, we propose a new parametrization of the Dirichlet distribution to flexibly regularize the mixing probabilities of over-parametrized mixture distributions. Using the Expectation-Maximization algorithm, we show that our approach allows us to define any linear update rule for the mixing probabilities, including spatial smoothing regularization as a special case. We then show that this flexible design can be extended to share class information between multiple mixture models. We apply our algorithm to artificial and natural image segmentation tasks, and we provide quantitative and qualitative comparison of the performance of Gaussian and Student-t mixtures on the Berkeley Segmentation Dataset. We also demonstrate how to propagate class information across the layers of deep convolutional neural networks in a probabilistically optimal way, suggesting a new interpretation for feedback signals in biological visual systems. Our flexible approach can be easily generalized to adapt probabilistic mixture models to arbitrary data topologies. (c) 2022 Elsevier Ltd. All rights reserved.
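The special case highlighted in the abstract (spatial smoothing of the mixing probabilities via a linear update inside EM) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the function name `em_smoothed_gmm`, the 3x3 box-filter smoother, the quantile initialization, and all parameter choices are assumptions made for the sketch.

```python
import numpy as np

def smooth2d(a):
    # 3x3 box average with edge padding: a simple linear smoothing operator.
    p = np.pad(a, 1, mode="edge")
    h, w = a.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def em_smoothed_gmm(img, K=3, n_iter=20):
    """Segment an (H, W) grayscale image; returns per-pixel class labels."""
    H, W = img.shape
    x = img.ravel()
    mu = np.quantile(x, np.linspace(0.1, 0.9, K))   # spread-out initial means
    var = np.full(K, x.var() + 1e-6)
    mix = np.full((K, H * W), 1.0 / K)              # per-pixel mixing probs
    for _ in range(n_iter):
        # E-step: responsibilities under spatially varying mixing probabilities.
        logp = (-0.5 * (x[None, :] - mu[:, None]) ** 2 / var[:, None]
                - 0.5 * np.log(2 * np.pi * var[:, None])
                + np.log(mix + 1e-12))
        logp -= logp.max(axis=0, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=0, keepdims=True)
        # M-step: component parameters from responsibility-weighted moments.
        Nk = r.sum(axis=1) + 1e-12
        mu = (r @ x) / Nk
        var = (r @ x ** 2) / Nk - mu ** 2 + 1e-6
        # Linear update of the mixing maps: local averaging of the
        # responsibilities, i.e. spatial-smoothing regularization.
        mix = np.stack([smooth2d(r[k].reshape(H, W)).ravel() for k in range(K)])
        mix /= mix.sum(axis=0, keepdims=True)
    return r.argmax(axis=0).reshape(H, W)
```

In this sketch the smoothing step plays the role of the paper's linear update rule: any linear operator on the responsibilities could be substituted for `smooth2d` to encode a different data topology.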
Pages: 107-123
Page count: 17