Research Seminar at the Institute of Applied Statistics

January 26th - Ritabrata ‘Rito’ Dutta, University of Warwick, UK (joint work with Lorenzo Pacchiardi and Sherman Khoo): Sampling Likelihood-Free ‘generalized’ posteriors with Stochastic Gradient MCMC

Zoom link

Meeting ID: 937 6054 7545

Password: 946296

Abstract:

We propose a framework for Bayesian Likelihood-Free Inference (LFI) based on Generalized Bayesian Inference. To define the generalized posterior, we use Scoring Rules (SRs), which evaluate probabilistic models given an observation. In LFI, we can sample from the model but not evaluate the likelihood; for this reason, we employ SRs with easy empirical estimators. Our framework includes novel approaches as well as popular LFI techniques (such as Bayesian Synthetic Likelihood) and enjoys posterior consistency in a well-specified setting when a strictly proper SR is used (i.e., one whose expectation is uniquely minimized when the model corresponds to the data-generating process). In general, our framework does not approximate the standard posterior; as such, it is possible to achieve outlier robustness, which we prove is the case for the Kernel and Energy Scores. Further, we show that our setup can utilise gradient-based Markov chain Monte Carlo (MCMC) methods to sample from the proposed generalized posterior, hence making high-dimensional parameter inference possible for models with intractable likelihood functions.
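
For readers who want a concrete picture before the talk, below is a minimal, illustrative sketch (not the speakers' code) of the ingredients the abstract describes: an unbiased empirical estimator of the energy score computed from simulator draws, a generalized posterior that tempers the prior by the exponentiated negative score, and a stochastic-gradient Langevin update to sample from it. The toy reparameterisable simulator, the standard-normal prior, the weight w, and the step size are all assumptions made for illustration; the actual work covers richer simulators and more general SG-MCMC schemes.

import jax
import jax.numpy as jnp

def simulate(theta, key, m=50, sigma=1.0):
    # Toy reparameterisable simulator (assumed): x = theta + sigma * eps,
    # so gradients with respect to theta flow through the samples.
    eps = jax.random.normal(key, (m, theta.shape[0]))
    return theta + sigma * eps

def energy_score(theta, y, key, beta=1.0):
    # Unbiased empirical estimate of the energy score S_E(P_theta, y):
    # (2/m) * sum_j ||x_j - y||^beta  -  2/(m(m-1)) * sum_{j<k} ||x_j - x_k||^beta.
    x = simulate(theta, key)
    m = x.shape[0]
    term1 = 2.0 * jnp.mean(jnp.linalg.norm(x - y, axis=-1) ** beta)
    i, j = jnp.triu_indices(m, k=1)
    term2 = 2.0 * jnp.sum(jnp.linalg.norm(x[i] - x[j], axis=-1) ** beta) / (m * (m - 1))
    return term1 - term2

def neg_log_generalized_posterior(theta, y, key, w=1.0):
    # -log of the scoring-rule posterior: prior(theta) * exp(-w * score), up to a constant.
    log_prior = -0.5 * jnp.sum(theta ** 2)  # standard-normal prior (assumed)
    return -log_prior + w * energy_score(theta, y, key)

def sgld_step(theta, y, key, step=1e-3):
    # One unadjusted stochastic-gradient Langevin update; a simple stand-in
    # for the stochastic gradient MCMC samplers discussed in the talk.
    k_sim, k_noise = jax.random.split(key)
    grad = jax.grad(neg_log_generalized_posterior)(theta, y, k_sim)
    noise = jnp.sqrt(step) * jax.random.normal(k_noise, theta.shape)
    return theta - 0.5 * step * grad + noise

# Usage: a single 2-dimensional observation and a short chain.
key = jax.random.PRNGKey(0)
y_obs = jnp.array([1.5, -0.5])
theta = jnp.zeros(2)
for _ in range(1000):
    key, sub = jax.random.split(key)
    theta = sgld_step(theta, y_obs, sub)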

Event

Time & date

January 26, 2023

15:30 - 17:00

Location

S2 Z74, Science Park 2

Contact

milan.stehlik@jku.at