Department of Mathematical Sciences at Chalmers University of Technology and University of Gothenburg
Sequential algorithms such as sequential importance sampling (SIS) and sequential Monte Carlo (SMC) have proven fundamental in Bayesian inference. However, probabilistic models often do not admit a readily available likelihood function or one that is computationally cheap to approximate.
In the last twenty years, simulation-based approaches have flourished as a way to bypass likelihood intractability by using the likelihood only implicitly, through model simulations. The most studied class of simulation-based inference methods is arguably approximate Bayesian computation (ABC). For ABC, sequential Monte Carlo (SMC-ABC) is the state-of-the-art sampler. However, since the ABC paradigm is intrinsically wasteful, sequential ABC schemes can benefit from well-targeted proposal samplers that efficiently avoid improbable parameter regions.
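To make the ABC paradigm concrete (and to illustrate why it is wasteful), the following is a minimal rejection-ABC sketch on a toy Gaussian model; the model, prior, tolerance, and summary statistic are illustrative assumptions, not taken from this work:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (an assumption for illustration): data are N(theta, 1),
# the prior is N(0, 10^2), and the summary statistic is the sample mean.
y_obs = rng.normal(2.0, 1.0, size=100)
s_obs = y_obs.mean()

def abc_rejection(n_accept, eps):
    """Basic rejection ABC: keep prior draws whose simulated summary
    falls within tolerance eps of the observed summary."""
    accepted = []
    while len(accepted) < n_accept:
        theta = rng.normal(0.0, 10.0)              # draw from the prior
        y_sim = rng.normal(theta, 1.0, size=100)   # simulate from the model
        if abs(y_sim.mean() - s_obs) <= eps:       # compare summaries
            accepted.append(theta)
    return np.array(accepted)

samples = abc_rejection(n_accept=200, eps=0.2)
print(samples.mean())  # typically close to the data-generating mean 2.0
```

Most prior draws are simulated and then discarded, which is the waste that better-targeted sequential proposals aim to reduce.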
We construct novel proposal samplers that are conditional on summary statistics of the data. In a sense, the proposed parameters are "guided" to rapidly reach regions of the posterior surface that are compatible with the observed data. This speeds up the convergence of these sequential samplers, thus reducing the computational effort while preserving accuracy in the inference. We provide a variety of guided samplers easing inference for challenging case studies, including multimodal posteriors, highly correlated posteriors, and hierarchical models with high-dimensional summary statistics.
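One generic way to build a proposal conditional on the observed summaries (a sketch of the general idea under stated assumptions, not necessarily the exact construction of this work) is to fit a joint Gaussian to previously accepted (parameter, summary) pairs and then propose from the Gaussian conditional of the parameters given the observed summary `s_obs`:

```python
import numpy as np

rng = np.random.default_rng(1)

def guided_gaussian_proposal(thetas, summaries, s_obs, n_draws):
    """thetas: (n, d_theta) accepted parameters; summaries: (n, d_s) their
    simulated summaries. Fits a joint Gaussian to (theta, s) and returns
    n_draws proposals from the conditional of theta given s = s_obs."""
    z = np.hstack([thetas, summaries])
    mu = z.mean(axis=0)
    cov = np.cov(z, rowvar=False)
    d = thetas.shape[1]
    mu_t, mu_s = mu[:d], mu[d:]
    S_tt = cov[:d, :d]          # blocks of the partitioned covariance
    S_ts = cov[:d, d:]
    S_ss = cov[d:, d:]
    # Standard conditioning formulas for a partitioned Gaussian
    A = S_ts @ np.linalg.inv(S_ss)
    cond_mu = mu_t + A @ (s_obs - mu_s)
    cond_cov = S_tt - A @ S_ts.T
    return rng.multivariate_normal(cond_mu, cond_cov, size=n_draws)

# Hypothetical accepted pairs from an earlier round: s = theta + noise
thetas = rng.normal(2.0, 1.0, size=(500, 1))
summaries = thetas + rng.normal(0.0, 0.5, size=(500, 1))
props = guided_gaussian_proposal(thetas, summaries, np.array([2.0]), 1000)
```

Because the proposal is centred where the summaries match the data and has a tighter covariance than the marginal over accepted parameters, fewer proposed parameters land in improbable regions.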