News

Lukas Bruder presents his semester thesis on "Stochastic variational Bayesian inference for inverse problems"


Abstract:

This thesis is concerned with stochastic variational inference and its application to model-based Bayesian inverse problems. Variational inference, like Markov chain Monte Carlo (MCMC), is a method for evaluating intractable, complex probability distributions. In Bayesian statistics, these are the posterior distributions, which represent unknown quantities as random variables and encode the available information about them; assessing this information is referred to as inference. In contrast to MCMC, variational inference is an approximation method based on minimizing the Kullback-Leibler (KL) divergence between the posterior distribution and a posited, parameterized approximating distribution. The optimization objective is to find the set of parameters for which the variational approximation is closest to the posterior distribution in the sense of minimal KL divergence.

Recent developments in variational inference research have produced a stochastic optimization framework that enables more flexible probabilistic models and more accurate posterior approximations. We review this stochastic approach to variational inference in detail and investigate its potential for posterior distributions arising from Bayesian inverse problem modeling. After establishing a theoretical basis, a unified stochastic optimization procedure is developed and its feasibility in high dimensions is demonstrated. Motivated by X-ray imaging and computed tomography, the problem of image reconstruction is examined and the application of the stochastic variational inference framework is demonstrated. For comparison, the same numerical example is solved with two standard MCMC algorithms, indicating that stochastic variational inference performs well in high dimensions.
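As standard background on the objective the abstract refers to (this is textbook material, not drawn from the thesis itself): given a parameterized approximating family q_λ(x) and the posterior p(x | y), variational inference solves

```latex
\lambda^{\star} = \arg\min_{\lambda} \,
  \mathrm{KL}\big( q_{\lambda}(x) \,\|\, p(x \mid y) \big).
```

Since the model evidence \log p(y) does not depend on λ, this is equivalent to maximizing the evidence lower bound (ELBO):

```latex
\mathrm{ELBO}(\lambda)
  = \mathbb{E}_{q_{\lambda}}\big[ \log p(y \mid x) + \log p(x) - \log q_{\lambda}(x) \big]
  = \log p(y) - \mathrm{KL}\big( q_{\lambda}(x) \,\|\, p(x \mid y) \big).
```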
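The stochastic optimization framework mentioned in the abstract is commonly realized with single-sample reparameterization gradients of the ELBO. The following is a minimal, illustrative sketch in plain NumPy for a toy linear-Gaussian inverse problem with a mean-field Gaussian approximation; the posterior is actually known in closed form in this setting, so the example serves only to show the stochastic optimization loop. All names, dimensions, and step sizes below are assumptions for illustration, not details taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-Gaussian inverse problem: y = A x + noise.
d, m = 20, 40
A = rng.normal(size=(m, d))
x_true = rng.normal(size=d)
noise_std, prior_std = 0.5, 1.0
y = A @ x_true + noise_std * rng.normal(size=m)

# Mean-field Gaussian variational family q(x) = N(mu, diag(sigma^2)),
# with variational parameters mu and log_sigma.
mu = np.zeros(d)
log_sigma = np.full(d, -2.0)

lr = 1e-3
for step in range(5000):
    eps = rng.normal(size=d)
    sigma = np.exp(log_sigma)
    x = mu + sigma * eps  # reparameterized sample from q

    # Gradient of the log joint, log p(y|x) + log p(x), w.r.t. x.
    grad_x = A.T @ (y - A @ x) / noise_std**2 - x / prior_std**2

    # Single-sample stochastic ELBO gradient: chain rule through
    # x = mu + sigma * eps; the entropy of q contributes +1 to the
    # log_sigma gradient.
    mu += lr * grad_x
    log_sigma += lr * (grad_x * sigma * eps + 1.0)

print("error of variational mean vs. x_true:", np.linalg.norm(mu - x_true))
```

A fixed step size is used for brevity; in practice a decaying Robbins-Monro schedule or an adaptive optimizer is typical, and for nonlinear forward models the hand-coded gradient would be replaced by automatic differentiation.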