Adaptive Importance Sampling Message Passing
- Resource Type
- Conference
- Authors
- Akbayrak, Semih; Senoz, Ismail; Vries, Bert de
- Source
- 2022 IEEE International Symposium on Information Theory (ISIT), pp. 1199-1204, Jun. 2022
- Subject
- Communication, Networking and Broadcast Technologies
- Monte Carlo methods
- Message passing
- Stochastic processes
- Optimization methods
- Estimation
- Programming
- Probabilistic logic
- approximate Bayesian inference
- importance sampling
- message passing
- variational inference
- Language
- ISSN
- 2157-8117
The aim of Probabilistic Programming (PP) is to automate inference in probabilistic models. One efficient realization of PP-based inference is variational message passing (VMP) on a factor graph. VMP is efficient, but in principle it leads to closed-form update rules only when the model consists of conjugate and/or conditionally conjugate factor pairs. Recently, Extended Variational Message Passing (EVMP) has been proposed to broaden the applicability of VMP through importance sampling-based particle methods for non-linear and non-conjugate factor pairs. EVMP automates the importance sampling procedure by employing forward messages as proposal distributions, which unfortunately may lead to inaccurate estimates and numerical instabilities when the forward message is not a good representative of the unknown correct posterior. This paper addresses this issue by integrating an adaptive importance sampling procedure with message passing-based inference. The resulting method is a hyperparameter-free approximate inference engine that combines recent advances in stochastic adaptive importance sampling and optimization methods. We provide an implementation of the proposed method in the Julia package ForneyLab.jl.
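To illustrate the general idea behind adaptive importance sampling (not the authors' exact algorithm, which is implemented in ForneyLab.jl), the following Python sketch adapts a Gaussian proposal by moment matching under the importance weights. The initial proposal plays the role of a poorly matched forward message; the toy target density, the fixed iteration count, and all variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative unnormalized target: a Gaussian tilted by a sigmoid factor,
# i.e. a non-conjugate posterior that a fixed Gaussian proposal fits poorly.
def log_target(x):
    return -0.5 * (x - 3.0) ** 2 - np.log1p(np.exp(-2.0 * x))

# Deliberately mismatched initial Gaussian proposal (the "forward message" role).
mu, sigma = 0.0, 5.0

for _ in range(20):
    xs = rng.normal(mu, sigma, size=2000)
    # Log importance weights: log target minus log proposal (constants cancel
    # after normalization below).
    log_w = log_target(xs) - (-0.5 * ((xs - mu) / sigma) ** 2 - np.log(sigma))
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    # Adaptation step: match the proposal's moments to the weighted sample,
    # pulling the proposal toward the target posterior.
    mu = float(np.sum(w * xs))
    sigma = float(np.sqrt(np.sum(w * (xs - mu) ** 2))) + 1e-12

posterior_mean, posterior_std = mu, sigma
```

After a few iterations the proposal concentrates near the target's bulk, so the importance weights become far less degenerate than with the initial proposal; this is the failure mode of a fixed forward-message proposal that adaptation is meant to avoid.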