High-accuracy log-concave sampling with stochastic queries
Daily Info Digest · 2026-02-15
2026-02-15T23:19:07Z
Published
AI Summary
- We show that high-accuracy guarantees for log-concave sampling -- that is, iteration and query complexities which scale as $\mathrm{poly}\log(1/\delta)$, where $\delta$ is the desired target accuracy -- are achievable using stochastic gradients with subexponential tails.
- Notably, this exhibits a separation with the problem of convex optimization, where stochasticity (even additive Gaussian noise) in the gradient oracle incurs $\mathrm{poly}(1/\delta)$ queries.
- We also give an information-theoretic argument that light-tailed stochastic gradients are necessary for high accuracy: for example, in the bounded variance case, the minimax-optimal query complexity scales as $\Theta(1/\delta)$.
- Our framework also provides similar high-accuracy guarantees under stochastic zeroth-order (value) queries.
#arXiv #paper #research/papers
Excerpt
We show that high-accuracy guarantees for log-concave sampling -- that is, iteration and query complexities which scale as $\mathrm{poly}\log(1/\delta)$, where $\delta$ is the desired target accuracy -- are achievable using stochastic gradients with subexponential tails. Notably, this exhibits a separation with the problem of convex optimization, where stochasticity (even additive Gaussian noise) in the gradient oracle incurs $\mathrm{poly}(1/\delta)$ queries. We also give an information-theoretic argument that light-tailed stochastic gradients are necessary for high accuracy: for example, in the bounded variance case, we show that the minimax-optimal query complexity scales as $\Theta(1/\delta)$. Our framework also provides similar high accuracy guarantees under stochastic zeroth order (value) queries.
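To make the setting concrete, here is a minimal illustrative sketch of sampling from a log-concave density with a *stochastic* gradient oracle, using stochastic gradient Langevin dynamics (SGLD). This is not the paper's algorithm (the abstract does not specify one); the target potential, noise level `sigma`, and step size `eta` are all hypothetical choices, and the gradient noise here is Gaussian, hence light-tailed (subexponential) as the paper's positive result requires.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_f(x):
    # Potential f(x) = ||x||^2 / 2, so the target pi(x) ∝ exp(-f(x))
    # is a standard Gaussian (a simple log-concave example).
    return x

def grad_f_noisy(x, sigma=0.1):
    # Stochastic gradient oracle: the true gradient plus Gaussian
    # (hence subexponential-tailed) noise. Hypothetical noise level.
    return grad_f(x) + sigma * rng.standard_normal(x.shape)

def sgld(x0, eta=1e-2, n_steps=20_000):
    # Stochastic gradient Langevin dynamics:
    #   x_{k+1} = x_k - eta * g(x_k) + sqrt(2 * eta) * xi_k,  xi_k ~ N(0, I).
    x = x0.copy()
    samples = np.empty((n_steps, x.size))
    for k in range(n_steps):
        x = x - eta * grad_f_noisy(x) + np.sqrt(2 * eta) * rng.standard_normal(x.shape)
        samples[k] = x
    return samples

samples = sgld(np.zeros(2))
tail = samples[10_000:]  # discard burn-in
# After burn-in, the empirical mean should be near 0 and the
# per-coordinate variance near 1 (the standard Gaussian target).
```

The point of the sketch is only to show the oracle model: each step queries a noisy gradient, and the noise tail behavior is exactly the quantity the paper's separation results hinge on.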