2024-06-30-jiang24a.md

---
title: Algorithms for mean-field variational inference via polyhedral optimization in the Wasserstein space
section: Original Papers
abstract: 'We develop a theory of finite-dimensional polyhedral subsets over the Wasserstein space and optimization of functionals over them via first-order methods. Our main application is to the problem of mean-field variational inference, which seeks to approximate a distribution $\pi$ over $\mathbb{R}^d$ by a product measure $\pi^\star$. When $\pi$ is strongly log-concave and log-smooth, we provide (1) approximation rates certifying that $\pi^\star$ is close to the minimizer $\pi^\star_\diamond$ of the KL divergence over a \emph{polyhedral} set $\mathcal{P}_\diamond$, and (2) an algorithm for minimizing $\text{KL}(\cdot\|\pi)$ over $\mathcal{P}_\diamond$ with accelerated complexity $O(\sqrt \kappa \log(\kappa d/\varepsilon^2))$, where $\kappa$ is the condition number of $\pi$.'
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: jiang24a
month: 0
tex_title: Algorithms for mean-field variational inference via polyhedral optimization in the {W}asserstein space
firstpage: 2720
lastpage: 2721
page: 2720-2721
order: 2720
cycles: false
bibtex_author: Jiang, Yiheng and Chewi, Sinho and Pooladian, Aram-Alexandre
author:
- given: Yiheng
  family: Jiang
- given: Sinho
  family: Chewi
- given: Aram-Alexandre
  family: Pooladian
date: 2024-06-30
address:
container-title: Proceedings of Thirty Seventh Conference on Learning Theory
volume: '247'
genre: inproceedings
issued:
  date-parts:
  - 2024
  - 6
  - 30
pdf:
extras:
---