no code implementations • 29 Apr 2024 • Dmitriy Kunisky, Cristopher Moore, Alexander S. Wein
This basis lets us unify and strengthen previous results on low-degree hardness, giving a combinatorial explanation of the hardness transition and of a continuum of subexponential-time algorithms that work below it, and proving tight lower bounds against low-degree polynomials for recovering rather than just detecting the signal.
no code implementations • 12 Mar 2024 • Dmitriy Kunisky
These results are the first computational lower bounds against any large class of algorithms for all of these models when the channel is not one of a few special cases, and thereby give the first substantial evidence for the universality of several statistical-to-computational gaps.
no code implementations • 22 May 2020 • Yunzi Ding, Dmitriy Kunisky, Alexander S. Wein, Afonso S. Bandeira
A matrix has the $(s,\delta)$-$\mathsf{RIP}$ property if it behaves as a $\delta$-approximate isometry on $s$-sparse vectors.
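The definition can be checked directly for small matrices: a matrix $A$ is $(s,\delta)$-$\mathsf{RIP}$ iff every $s$-column Gram submatrix $A_S^\top A_S$ has all eigenvalues in $[1-\delta, 1+\delta]$. A minimal brute-force sketch (not from the paper; normalization conventions for RIP vary slightly across the literature):

```python
import itertools

import numpy as np

def is_rip(A, s, delta):
    """Brute-force check of the (s, delta)-RIP property: for every s-sparse
    unit vector x, (1 - delta) <= ||A x||_2^2 <= (1 + delta).  Equivalently,
    every s-column Gram submatrix has all eigenvalues in [1-delta, 1+delta].
    Cost is exponential in s -- intended for toy sizes only.
    """
    n = A.shape[1]
    for support in itertools.combinations(range(n), s):
        cols = A[:, list(support)]
        eigs = np.linalg.eigvalsh(cols.T @ cols)
        if eigs[0] < 1 - delta or eigs[-1] > 1 + delta:
            return False
    return True

# A Gaussian matrix with entries N(0, 1/m) is RIP with high probability
# once m is large compared to s * log(n / s).
rng = np.random.default_rng(0)
m, n = 200, 10
A = rng.standard_normal((m, n)) / np.sqrt(m)
print(is_rip(A, s=2, delta=0.5))
```

Certifying RIP for a *given* matrix is exactly the computational task the paper studies; the brute-force check above takes $\binom{n}{s}$ eigendecompositions, which is why efficient certification is the interesting question.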
no code implementations • 26 Jul 2019 • Dmitriy Kunisky, Alexander S. Wein, Afonso S. Bandeira
These notes survey and explore an emerging method, which we call the low-degree method, for predicting and understanding statistical-versus-computational tradeoffs in high-dimensional inference problems.
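The central quantity in the low-degree method is the norm of the low-degree projection of the likelihood ratio: for a Gaussian additive model $Y = \theta + Z$ with $Z$ having i.i.d. $N(0,1)$ entries, it admits the clean formula $\|L^{\le D}\|^2 = \sum_{d=0}^{D} \mathbb{E}[\langle \theta, \theta' \rangle^d]/d!$, where $\theta, \theta'$ are independent draws of the signal; boundedness as $D$ grows is interpreted as evidence of hardness. A Monte-Carlo sketch of this computation (the toy spherical prior and all parameters below are illustrative, not from the notes):

```python
import math

import numpy as np

def low_degree_norm_sq(sample_signal, D, trials=20000, rng=None):
    """Monte-Carlo estimate of ||L^{<=D}||^2 for a Gaussian additive model
    Y = theta + Z (Z with iid N(0,1) entries), via the identity
    ||L^{<=D}||^2 = sum_{d=0}^{D} E[<theta, theta'>^d] / d!
    with theta, theta' independent draws from the signal prior.
    """
    rng = rng or np.random.default_rng(0)
    total = np.zeros(D + 1)
    for _ in range(trials):
        ip = np.vdot(sample_signal(rng), sample_signal(rng))
        total += ip ** np.arange(D + 1)  # moments <theta,theta'>^d, d=0..D
    moments = total / trials
    return sum(moments[d] / math.factorial(d) for d in range(D + 1))

# Toy prior: signal lam * x with x uniform on the unit sphere in R^n.
n, lam = 50, 0.5
def sample_signal(rng):
    x = rng.standard_normal(n)
    return lam * x / np.linalg.norm(x)

print(low_degree_norm_sq(sample_signal, D=4))
```

For this weak-signal prior the estimate stays close to 1 (no low-degree distinguisher), matching the heuristic that a bounded low-degree norm predicts computational hardness of detection.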
no code implementations • 26 Jul 2019 • Yunzi Ding, Dmitriy Kunisky, Alexander S. Wein, Afonso S. Bandeira
Prior work has shown that when the signal-to-noise ratio ($\lambda$ or $\beta\sqrt{N/n}$, respectively) is a small constant and the fraction of nonzero entries in the planted vector is $\|x\|_0 / n = \rho$, it is possible to recover $x$ in polynomial time if $\rho \lesssim 1/\sqrt{n}$.
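A minimal simulation of the spiked Wigner variant of this model under one common normalization (sparse Rademacher prior with $\|x\|_2 = 1$; the parameter values are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n, rho, lam = 400, 0.1, 3.0  # illustrative values only

# Sparse Rademacher spike with ||x||_0 / n = rho and ||x||_2 = 1.
k = int(rho * n)
support = rng.choice(n, size=k, replace=False)
x = np.zeros(n)
x[support] = rng.choice([-1.0, 1.0], size=k) / np.sqrt(k)

# Wigner noise: symmetric Gaussian matrix, off-diagonal variance 1/n.
G = rng.standard_normal((n, n))
W = (G + G.T) / np.sqrt(2 * n)

# Spiked observation at signal-to-noise ratio lam.
Y = lam * np.outer(x, x) + W

# Above the spectral (BBP) threshold lam > 1, the top eigenvector of Y
# correlates with the planted vector x.
v = np.linalg.eigh(Y)[1][:, -1]
print(abs(v @ x))  # overlap with the planted vector
```

In this regime plain spectral recovery already works; the interesting parameter range in the paper is small constant signal-to-noise, where exploiting the sparsity $\rho$ is what makes recovery possible or costly.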