In many inverse problems, the forward model “smooths” or filters the input parameters. As a result, observations of the model output can only inform certain functionals of the parameters, while complementary parts of the posterior are dominated by the prior. In high-dimensional problems, discovering this structure through direct application of MCMC can be computationally intractable.

We propose a multiscale decomposition of the inference problem that takes advantage of multiscale methods for simulating the systems under consideration—including, but not limited to, multiscale methods for the solution of certain PDEs. Multiscale methods can be interpreted as a means of identifying conditional independence structure, such that the parameters of interest are conditionally independent of the observations given some intermediate coarse-scale quantities. Inference that exploits this structure, particularly when the relationship between scales is nonlinear, requires new approaches. Using tools from optimal transportation, we develop an approach for extracting a prior distribution on these coarse-scale quantities (as a pushforward of the original prior) and for conditionally sampling the original parameters given the coarse-scale quantities. The resulting scheme couples conditional sampling of the parameters to a low-dimensional inference problem where typical MCMC methods can be applied. We illustrate our approach on problems having between 2 and 10,000 parameters. We also discuss the relationship between this approach and other “subspace MCMC” methods.
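To make the two ingredients concrete—pushing the prior forward onto coarse-scale quantities, then conditionally sampling the fine-scale parameters given a coarse value—here is a minimal sketch in a linear-Gaussian setting, where both steps are available in closed form. This is only an illustrative stand-in for the nonlinear optimal-transport construction described above: the coarsening operator `A` (a spatial average), the prior covariance, and all variable names are hypothetical choices, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical fine-scale prior: x ~ N(0, Sigma) in d dimensions,
# with spatial correlation so that coarsening is informative.
d = 4
Sigma = 0.5 * np.eye(d) + 0.5 * np.ones((d, d))

# Hypothetical coarsening operator: c = A x is the spatial average of x.
A = np.full((1, d), 1.0 / d)

# Step 1: pushforward of the prior onto the coarse scale.
# For a linear map of a Gaussian this is exact: c ~ N(0, A Sigma A^T).
Sigma_c = A @ Sigma @ A.T

# Step 2: conditional sampling of x given c. Gaussian conditioning plays
# the role that triangular transport maps play in the nonlinear case.
K = Sigma @ A.T @ np.linalg.inv(Sigma_c)      # regression "gain"
Sigma_post = Sigma - K @ A @ Sigma            # conditional covariance
# Sigma_post is singular along A (conditioning is exact), so add jitter.
L = np.linalg.cholesky(Sigma_post + 1e-12 * np.eye(d))

def sample_x_given_c(c, n):
    """Draw n fine-scale samples consistent with the coarse value c."""
    mean = (K * c).ravel()
    return mean + rng.standard_normal((n, d)) @ L.T

# Conditional samples reproduce the coarse value under A (up to jitter),
# so an MCMC chain on c alone can drive fine-scale sampling.
c_star = 1.3
xs = sample_x_given_c(c_star, 5000)
```

In the full scheme, MCMC targets the low-dimensional posterior over `c`, and `sample_x_given_c` (or its transport-map analogue) lifts each accepted coarse state back to fine-scale parameter samples.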

**Keywords:** Bayesian inference, multiscale methods, optimal transport, reduced-order modeling

*This is joint work with Youssef Marzouk and Tarek Moselhy.*