Large Scale Bayesian Inference and Experimental Design for Sparse Linear Models

Times Cited: 40
Authors
Seeger, Matthias W. [1]
Nickisch, Hannes [2 ]
Affiliations
[1] Ecole Polytech Fed Lausanne, Sch Comp & Commun Sci, CH-1015 Lausanne, Switzerland
[2] Max Planck Inst Biol Cybernet, D-72076 Tubingen, Germany
Source
SIAM JOURNAL ON IMAGING SCIENCES | 2011, Vol. 4, No. 1
Keywords
sparse linear model; sparsity prior; experimental design; sampling optimization; image acquisition; variational approximate inference; Bayesian statistics; compressive sensing; sparse reconstruction; magnetic resonance imaging; MAXIMUM-LIKELIHOOD; SELECTION; ALGORITHMS; REGRESSION; ROBUST;
DOI
10.1137/090758775
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Many problems of low-level computer vision and image processing, such as denoising, deconvolution, tomographic reconstruction or superresolution, can be addressed by maximizing the posterior distribution of a sparse linear model (SLM). We show how higher-order Bayesian decision-making problems, such as optimizing image acquisition in magnetic resonance scanners, can be addressed by querying the SLM posterior covariance, unrelated to the density's mode. We propose a scalable algorithmic framework, with which SLM posteriors over full, high-resolution images can be approximated for the first time, solving a variational optimization problem which is convex if and only if posterior mode finding is convex. These methods successfully drive the optimization of sampling trajectories for real-world magnetic resonance imaging through Bayesian experimental design, which has not been attempted before. Our methodology provides new insight into similarities and differences between sparse reconstruction and approximate Bayesian inference, and has important implications for compressive sensing of real-world images. Parts of this work have been presented at conferences [M. Seeger, H. Nickisch, R. Pohmann, and B. Scholkopf, in Advances in Neural Information Processing Systems 21, D. Koller, D. Schuurmans, Y. Bengio, and L. Bottou, eds., Curran Associates, Red Hook, NY, 2009, pp. 1441-1448; H. Nickisch and M. Seeger, in Proceedings of the 26th International Conference on Machine Learning, L. Bottou and M. Littman, eds., Omni Press, Madison, WI, 2009, pp. 761-768].
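The abstract's starting point is maximizing the posterior of a sparse linear model (sparse MAP reconstruction). The paper's variational inference and experimental-design machinery goes well beyond this, but as a minimal, illustrative sketch of the MAP step only (not the authors' algorithm), iterative shrinkage-thresholding (ISTA) for l1-penalized least squares recovers a sparse signal from linear measurements; all names and the synthetic data below are for illustration:

```python
import numpy as np

def soft_threshold(v, t):
    # Elementwise soft-thresholding: proximal operator of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(X, y, lam, n_iter=500):
    # Minimize 0.5 * ||y - X u||^2 + lam * ||u||_1 (Laplace-prior MAP).
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the smooth part
    u = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ u - y)           # gradient of the quadratic term
        u = soft_threshold(u - grad / L, lam / L)
    return u

# Synthetic demo: recover a 3-sparse vector from noisy linear measurements.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 50))
u_true = np.zeros(50)
u_true[[3, 17, 40]] = [2.0, -1.5, 1.0]
y = X @ u_true + 0.01 * rng.standard_normal(100)
u_hat = ista(X, y, lam=0.5)
```

Note that such a point estimate carries no posterior covariance; the paper's contribution is precisely to approximate that covariance at scale so it can drive experimental design.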
Pages: 166-199
Page Count: 34