Research Seminar at the Institute of Applied Statistics
January 26th - Ritabrata ‘Rito’ Dutta, University of Warwick, UK (joint work with Lorenzo Pacchiardi and Sherman Khoo): Sampling Likelihood-Free ‘Generalized’ Posteriors with Stochastic Gradient MCMC
Zoom link
meeting ID: 937 6054 7545
password: 946296
Abstract:
We propose a framework for Bayesian Likelihood-Free Inference (LFI) based on Generalized Bayesian Inference. To define the generalized posterior, we use Scoring Rules (SRs), which evaluate probabilistic models given an observation. In LFI, we can sample from the model but cannot evaluate the likelihood; for this reason, we employ SRs with easy empirical estimators. Our framework includes novel approaches as well as popular LFI techniques (such as Bayesian Synthetic Likelihood) and enjoys posterior consistency in the well-specified setting when a strictly proper SR is used (i.e., one whose expectation is uniquely minimized when the model corresponds to the data-generating process). In general, our framework does not approximate the standard posterior; as such, it is possible to achieve outlier robustness, which we prove is the case for the Kernel and Energy Scores. Further, we show that our setup can utilise gradient-based Markov chain Monte Carlo (MCMC) methods to sample from the proposed generalized posterior, making high-dimensional parameter inference possible for models with intractable likelihoods.
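To make the idea concrete, the sketch below (not the speakers' implementation) shows an unbiased empirical estimator of the Energy Score from simulator draws, and a toy generalized-posterior sampler targeting π(θ | y) ∝ π(θ) exp(−w · S(P_θ, y)). The Gaussian-location simulator, the N(0, 10) prior, the weight w, and the use of random-walk Metropolis (in place of the gradient-based MCMC discussed in the talk) are all illustrative assumptions:

```python
import numpy as np

def energy_score(sims, y, beta=1.0):
    """Unbiased empirical estimator of the Energy Score
    ES(P, y) = 2 E||X - y||^beta - E||X - X'||^beta,
    strictly proper for beta in (0, 2). `sims` has shape (m, d)."""
    m = sims.shape[0]
    term1 = np.linalg.norm(sims - y, axis=-1) ** beta
    pair = np.linalg.norm(sims[:, None, :] - sims[None, :, :], axis=-1) ** beta
    return 2.0 * term1.mean() - pair.sum() / (m * (m - 1))

def sr_posterior_mh(y, n_iter=2000, m=200, w=1.0, step=0.5, seed=0):
    """Random-walk Metropolis targeting the generalized posterior
    pi(theta | y) ∝ pi(theta) exp(-w * ES(P_theta, y)),
    for a hypothetical Gaussian-location simulator x = theta + noise."""
    rng = np.random.default_rng(seed)

    def neg_log_target(th):
        sims = rng.normal(th, 1.0, size=(m, 1))  # forward-simulate the model
        neg_log_prior = 0.5 * th ** 2 / 10.0     # N(0, 10) prior (assumed)
        return neg_log_prior + w * energy_score(sims, y)

    theta, cur = 0.0, None
    cur = neg_log_target(theta)
    chain = []
    for _ in range(n_iter):
        prop = theta + step * rng.normal()
        new = neg_log_target(prop)
        # Accept/reject on the stochastic SR estimate (Monte Carlo within Metropolis)
        if np.log(rng.uniform()) < cur - new:
            theta, cur = prop, new
        chain.append(theta)
    return np.array(chain)
```

Because the Energy Score only needs simulator draws, no likelihood evaluation ever occurs; the paper's gradient-based samplers replace the random-walk proposal with gradients of the SR estimate through the simulator.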
Event
S2 Z74, Science Park 2
alexandra.stadler@jku.at