---
title: Computation-information gap in high-dimensional clustering
section: Original Papers
abstract: We investigate the existence of a fundamental computation-information gap for the problem of clustering a mixture of isotropic Gaussians in the high-dimensional regime, where the ambient dimension $p$ is larger than the number $n$ of points. The existence of a computation-information gap in a specific Bayesian high-dimensional asymptotic regime has been conjectured by Lesieur et al. (2016) based on the replica heuristic from statistical physics. We provide evidence of the existence of such a gap generically in the high-dimensional regime $p\geq n$ by (i) proving a non-asymptotic low-degree-polynomial computational barrier for clustering in high dimension, matching the performance of the best known polynomial-time algorithms, and by (ii) establishing that the information barrier for clustering is smaller than the computational barrier when the number $K$ of clusters is large enough. These results stand in contrast with the (moderately) low-dimensional regime $n\geq \text{poly}(p,K)$, where there is no computation-information gap for clustering a mixture of isotropic Gaussians. To prove our low-degree computational barrier, we develop sophisticated combinatorial arguments to upper-bound the mixed moments of the signal under a Bernoulli Bayesian model.
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: even24a
month: 0
tex_title: Computation-information gap in high-dimensional clustering
firstpage: 1646
lastpage: 1712
page: 1646-1712
order: 1646
cycles: false
bibtex_author: Even, Bertrand and Giraud, Christophe and Verzelen, Nicolas
author:
- given: Bertrand
  family: Even
- given: Christophe
  family: Giraud
- given: Nicolas
  family: Verzelen
date: 2024-06-30
address:
container-title: Proceedings of Thirty Seventh Conference on Learning Theory
volume: '247'
genre: inproceedings
issued:
  date-parts:
  - 2024
  - 6
  - 30
pdf:
extras:
---