Seminar in Numerical Analysis: Robert Gruhlke (FU Berlin)
Ensemble methods have become ubiquitous for solving Bayesian inference problems, in particular for efficient sampling from posterior densities. State-of-the-art subclasses of Markov chain Monte Carlo methods rely on gradient information of the log-density, including Langevin samplers such as the Ensemble Kalman Sampler (EKS) and Affine Invariant Langevin Dynamics (ALDI). These dynamics are described by stochastic differential equations (SDEs) with time-homogeneous drift terms.
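As a rough illustration of such ensemble Langevin dynamics, the sketch below implements an Euler-Maruyama step of an EKS/ALDI-style SDE on a toy Gaussian target. This is a hypothetical minimal example, not the speaker's implementation; the covariance-preconditioned drift, the square-root noise, and the finite-ensemble correction term follow the general ALDI form.

```python
import numpy as np

def eks_step(X, grad_log_density, dt, rng):
    """One Euler-Maruyama step of an EKS/ALDI-style ensemble Langevin SDE.

    X: (J, d) ensemble. The drift is preconditioned by the empirical
    covariance C(X) and the noise is sqrt(2 C(X)) dW, which makes the
    dynamics affine invariant.
    """
    J, d = X.shape
    Xc = X - X.mean(axis=0)          # centered ensemble
    C = Xc.T @ Xc / J                # empirical covariance, (d, d)
    drift = grad_log_density(X) @ C  # C * grad log pi(X_i) for each particle
    drift += (d + 1) / J * Xc        # ALDI finite-ensemble correction
    L = np.linalg.cholesky(C + 1e-10 * np.eye(d))
    noise = rng.standard_normal((J, d)) @ L.T
    return X + dt * drift + np.sqrt(2 * dt) * noise

# usage: sample a standard Gaussian target, grad log pi(x) = -x
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2)) * 3 + 5   # dispersed initial ensemble
for _ in range(2000):
    X = eks_step(X, lambda X: -X, dt=1e-2, rng=rng)
print(X.mean(axis=0))   # roughly [0, 0] after equilibration
```

Note that the drift here does not depend on time once the target is fixed: this is exactly the time-homogeneous setting referred to above.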
In this talk we present enhancement strategies for such ensemble methods based on sample enrichment and a homotopy formalism; these ultimately lead to time-dependent drift terms that can assimilate a larger class of target distributions while providing faster mixing times.
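One simple way a homotopy can induce a time-dependent drift is likelihood tempering, pi_t proportional to prior times likelihood to the power t, whose score serves as the drift while t ramps from 0 to 1. The toy 1D sketch below (a hypothetical illustration with Gaussian prior and likelihood, not the method presented in the talk) shows overdamped Langevin dynamics with such a tempered drift:

```python
import numpy as np

# Tempering homotopy: pi_t(x) ∝ prior(x) * likelihood(x)**t, with t: 0 -> 1.
# Toy 1D Gaussians: prior N(0, 1), likelihood ∝ N(2, 0.25).
def score_t(x, t):
    # grad log pi_t = grad log prior + t * grad log likelihood
    return -x + t * (-(x - 2.0) / 0.25)

# Euler-Maruyama Langevin with the time-dependent tempered drift
rng = np.random.default_rng(2)
x = rng.standard_normal(50_000)          # start in the prior
dt = 2e-3
for k in range(5000):
    t = min(1.0, k * dt)                 # homotopy parameter ramps to 1
    x += dt * score_t(x, t) + np.sqrt(2 * dt) * rng.standard_normal(x.size)

# posterior at t=1: precision 1 + 4 = 5, mean 8/5 = 1.6, variance 0.2
print(x.mean(), x.var())
```

At t = 0 the particles are already in equilibrium with the prior, so the slowly varying drift keeps them close to pi_t throughout, which is the intuition behind faster mixing.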
Furthermore, we present an alternative route to constructing time-inhomogeneous drift terms based on reverse diffusion processes, which are popular in state-of-the-art generative modelling such as diffusion models. Here, we propose learning these log-densities by propagating the target distribution through an Ornstein-Uhlenbeck process. To this end, we solve the associated Hamilton-Jacobi-Bellman equation through an adaptive explicit Euler discretization, using low-rank compression such as functional tensor trains for the spatial discretization.
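The forward Ornstein-Uhlenbeck propagation can be sketched exactly in a toy 1D setting, since the OU transition kernel is Gaussian in closed form. The snippet below (a hypothetical illustration; the choice of target and time t is arbitrary) samples the noised marginal whose log-density would drive the reverse-time diffusion:

```python
import numpy as np

# Forward OU process dX = -X dt + sqrt(2) dW pushes any initial law toward
# N(0, 1); its exact transition X_t = e^{-t} X_0 + sqrt(1 - e^{-2t}) * xi
# lets us sample the noised marginals without time stepping.
rng = np.random.default_rng(1)
x0 = rng.standard_normal(100_000) * 2.0 + 3.0   # toy 1D "target": N(3, 4)
t = 0.7
xt = np.exp(-t) * x0 + np.sqrt(1 - np.exp(-2 * t)) * rng.standard_normal(x0.shape)

# exact Gaussian moments of the OU marginal, for comparison
mean_t = 3.0 * np.exp(-t)
var_t = 4.0 * np.exp(-2 * t) + (1 - np.exp(-2 * t))
print(xt.mean(), mean_t, xt.var(), var_t)
```

For non-Gaussian targets these marginal log-densities are not available in closed form, which is where the numerical solution of the associated Hamilton-Jacobi-Bellman equation comes in.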
For further information about the seminar, please visit this webpage.