Jean-David Jacques (Potsdam)
Sven Wang (HU Berlin)
We consider the problem of generating random samples from high-dimensional posterior distributions. We will discuss both (i) conditions under which diffusion-based MCMC algorithms mix in polynomial time (based on https://arxiv.org/pdf/2009.05298.pdf) and (ii) situations in which `cold-start' MCMC suffers from an exponentially long mixing time (based on https://arxiv.org/pdf/2209.02001.pdf). We will focus on the setting of non-linear inverse regression models. Our positive results on polynomial-time mixing are derived under local `gradient stability' assumptions on the forward map, which can be verified for a range of well-known non-linear inverse problems involving elliptic PDEs, as well as under the assumption that a good initializer is available. Our negative results on exponentially long mixing times concern `cold-start' MCMC: we show that there exist non-linear regression models in which the posterior distribution is unimodal yet exhibits a so-called `free entropy barrier', which local Markov chains take an exponentially long time to traverse.
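As a minimal illustration of the kind of diffusion-based MCMC the talk concerns, the following sketch implements the unadjusted Langevin algorithm (ULA) on a toy Gaussian target. The function names, step size, and target are illustrative assumptions, not the specific algorithms or models analyzed in the papers above.

```python
import numpy as np

def ula_sample(grad_log_post, x0, step, n_steps, rng=None):
    """Unadjusted Langevin algorithm (ULA), a basic diffusion-based MCMC:

        x_{k+1} = x_k + step * grad_log_post(x_k) + sqrt(2*step) * N(0, I).

    Returns the full chain as an array of shape (n_steps + 1, dim).
    """
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(x0, dtype=float)
    chain = [x.copy()]
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x + step * grad_log_post(x) + np.sqrt(2.0 * step) * noise
        chain.append(x.copy())
    return np.array(chain)

# Toy example: standard Gaussian target, so grad log pi(x) = -x.
chain = ula_sample(lambda x: -x, x0=np.zeros(2), step=0.1, n_steps=5000)
```

For a log-concave target like this Gaussian, the chain mixes rapidly; the exponential slowdowns discussed in the talk arise for more delicate (though still unimodal) posteriors with a free entropy barrier.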