March 3, 2025

[2410.02490] Stochastic variance-reduced Gaussian variational inference on the Bures-Wasserstein manifold


By Hoang Phuc Hau Luu and 4 other authors

Abstract: Optimization in the Bures-Wasserstein space has been gaining popularity in the machine learning community, since it draws connections between variational inference and Wasserstein gradient flows. The Kullback-Leibler divergence objective of variational inference can be written as the sum of the negative entropy and the potential energy, making forward-backward Euler the method of choice. Notably, the backward step admits a closed-form solution in this case, which makes the scheme practical. However, the forward step is not exact, since the Bures-Wasserstein gradient of the potential energy involves intractable expectations. Recent approaches propose approximating these terms with a Monte Carlo method (in practice, a single-sample estimator), resulting in high variance and poor performance. We propose a novel variance-reduced estimator based on the principle of control variates. We theoretically show that this estimator has a smaller variance than the Monte Carlo estimator in scenarios of interest. We also prove that variance reduction helps improve the optimization bounds of the current analysis. We demonstrate that the proposed estimator achieves order-of-magnitude improvements over previous Bures-Wasserstein methods.
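The abstract's key idea, variance reduction via control variates, can be illustrated with a generic Monte Carlo example (this is a minimal sketch of the general principle, not the paper's Bures-Wasserstein construction). We estimate E[exp(U)] for U ~ Uniform(0, 1), whose exact value is e - 1, using g(U) = U as a control variate with known mean E[g] = 1/2:

```python
import math
import random

def plain_mc(samples):
    """Plain Monte Carlo estimate of E[exp(U)]."""
    return sum(math.exp(u) for u in samples) / len(samples)

def control_variate_mc(samples):
    """Control-variate estimate: subtract c * (g - E[g]) from each term.

    The correction has zero mean, so the estimator stays unbiased,
    but it cancels noise correlated between f(U) = exp(U) and g(U) = U.
    """
    n = len(samples)
    f = [math.exp(u) for u in samples]
    g = samples
    # Near-optimal coefficient c* = Cov(f, g) / Var(g), estimated from the batch.
    fm, gm = sum(f) / n, sum(g) / n
    cov = sum((fi - fm) * (gi - gm) for fi, gi in zip(f, g)) / n
    var = sum((gi - gm) ** 2 for gi in g) / n
    c = cov / var
    return sum(fi - c * (gi - 0.5) for fi, gi in zip(f, g)) / n

random.seed(0)
true_value = math.e - 1
# Compare squared errors of both estimators over repeated small batches.
plain_errs, cv_errs = [], []
for _ in range(200):
    batch = [random.random() for _ in range(50)]
    plain_errs.append((plain_mc(batch) - true_value) ** 2)
    cv_errs.append((control_variate_mc(batch) - true_value) ** 2)
print(sum(cv_errs) < sum(plain_errs))
```

Because exp(U) and U are strongly correlated on [0, 1], the control-variate estimator's error is much smaller on average; the paper applies this same principle to the stochastic Bures-Wasserstein gradient of the potential energy.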

Submission history

From: Hoang Phuc Hau Luu
[v1]
Thu, 3 Oct 2024 13:55:49 UTC (3,252 KB)
[v2]
Fri, 28 Feb 2025 04:48:54 UTC (4,214 KB)
