8–12 July 2024
BÂTIMENT D’ENSEIGNEMENT MUTUALISÉ (BEM)
Europe/Paris timezone

Optimizing Markov Chain Monte Carlo Convergence with Normalizing Flows and Gibbs Sampling

Not scheduled
20m

Bâtiment d'Enseignement Mutualisé (BEM) Av. Fresnel, 91120 Palaiseau
Poster

Speaker

Christoph Schönle (CMAP, Ecole Polytechnique)

Description

Generative models have started to integrate into the scientific computing toolkit. One notable instance of this integration is the use of normalizing flows (NF) in the development of sampling and variational inference algorithms. This work introduces a novel algorithm, GflowMC, which relies on a Metropolis-within-Gibbs framework within the latent space of NFs. This approach addresses the challenge of vanishing acceptance probabilities often encountered when using NF-generated independent proposals, while retaining non-local updates, enhancing its suitability for sampling multi-modal distributions. We assess GflowMC’s performance, focusing on the Phi4 model from statistical mechanics. Our results demonstrate that, by identifying an optimal size for partial updates, convergence of the Markov Chain Monte Carlo (MCMC) sampler can be achieved faster than with full updates. Additionally, we explore the adaptability of GflowMC for biasing proposals toward increasing the update frequency of critical coordinates, such as those highly correlated with mode switching in multi-modal targets.
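
To make the latent-space Metropolis-within-Gibbs idea concrete, here is a minimal, self-contained Python/NumPy sketch. The toy affine "flow", the bimodal target, the block size k, and all function names are illustrative assumptions, not the authors' GflowMC implementation: it only shows the generic mechanism of redrawing a block of latent coordinates from the flow's base distribution and accepting or rejecting the move against the pullback of the target.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 4  # dimension of the toy target


def log_target(x):
    # Toy bimodal target: mixture of two isotropic Gaussians at +/- m.
    m = 2.5
    return np.logaddexp(-0.5 * np.sum((x - m) ** 2),
                        -0.5 * np.sum((x + m) ** 2))


# Stand-in "flow": an element-wise affine map x = mu + sigma * z.
# A trained normalizing flow would replace these functions.
mu = np.zeros(D)
sigma = 2.0 * np.ones(D)


def flow_forward(z):
    return mu + sigma * z


def flow_logdet(z):
    return np.sum(np.log(sigma))  # log|det J_f(z)| of the affine map


def latent_log_density(z):
    # Pullback of the target through the flow: log p(f(z)) + log|det J_f(z)|.
    return log_target(flow_forward(z)) + flow_logdet(z)


def log_base_block(v):
    # Standard-normal base density restricted to the updated coordinates.
    return -0.5 * np.sum(v ** 2)


def gibbs_block_step(z, block):
    # Metropolis-within-Gibbs: redraw the latent coordinates in `block`
    # from the flow's base distribution, keep the rest, accept/reject.
    z_prop = z.copy()
    z_prop[block] = rng.standard_normal(len(block))
    log_alpha = (latent_log_density(z_prop) + log_base_block(z[block])
                 - latent_log_density(z) - log_base_block(z_prop[block]))
    if np.log(rng.uniform()) < log_alpha:
        return z_prop, True
    return z, False


# Sweep with partial updates of size k (the tunable block size discussed above).
k = 2
z = rng.standard_normal(D)
accepted, n_steps = 0, 5000
for _ in range(n_steps):
    block = rng.choice(D, size=k, replace=False)
    z, acc = gibbs_block_step(z, block)
    accepted += acc
print(f"acceptance rate: {accepted / n_steps:.2f}")
```

In this sketch the block is chosen uniformly at random; biasing the choice of `block` toward coordinates correlated with mode switching would mimic, in spirit, the biased proposals mentioned in the abstract.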

Primary authors

Christoph Schönle (CMAP, Ecole Polytechnique), Marylou Gabrié (École Polytechnique)

Presentation materials

No documents.