no code implementations • 2 Feb 2024 • Benjamin Gess, Sebastian Kassing, Nimit Rana
We give quantitative estimates for the rate of convergence of Riemannian stochastic gradient descent (RSGD) to Riemannian gradient flow and to a diffusion process, the so-called Riemannian stochastic modified flow (RSMF).
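For orientation, here is a minimal sketch of RSGD on the unit sphere, one of the simplest Riemannian settings; the function name `rsgd_sphere` and the normalization retraction are illustrative choices, not the paper's setup.

```python
import numpy as np

def rsgd_sphere(grad_sample, x0, lr=1e-2, steps=1000, rng=None):
    """Minimal Riemannian SGD on the unit sphere.

    grad_sample(x, rng) returns an unbiased estimate of the Euclidean
    gradient of the objective at x.
    """
    rng = rng or np.random.default_rng(0)
    x = x0 / np.linalg.norm(x0)
    for _ in range(steps):
        g = grad_sample(x, rng)
        g_tan = g - np.dot(g, x) * x   # project onto the tangent space: Riemannian gradient
        x = x - lr * g_tan             # gradient step in the ambient space
        x /= np.linalg.norm(x)         # retract back onto the sphere
    return x
```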
no code implementations • 14 Feb 2023 • Benjamin Gess, Sebastian Kassing, Vitalii Konarovskyi
We propose new limiting dynamics, called stochastic modified flows, for stochastic gradient descent in the small learning rate regime.
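For context, a schematic first-order diffusion approximation of SGD with learning rate $\eta$ (the stochastic modified equation in the spirit of Li, Tai and E) can be sketched as below; the exact form of the paper's stochastic modified flows differs, so this is orientation only.

```latex
% Schematic first-order stochastic modified equation for SGD with
% learning rate \eta; f is the objective, \Sigma the covariance of
% the gradient noise, and W a Brownian motion.
\[
  \mathrm{d}X_t
    = -\nabla\!\Big( f(X_t) + \tfrac{\eta}{4}\,\bigl|\nabla f(X_t)\bigr|^2 \Big)\,\mathrm{d}t
      + \sqrt{\eta}\,\Sigma(X_t)^{1/2}\,\mathrm{d}W_t .
\]
```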
no code implementations • 7 Feb 2023 • Benjamin Gess, Sebastian Kassing
We consider the momentum stochastic gradient descent scheme (MSGD) and its continuous-in-time counterpart in the context of non-convex optimization.
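A minimal sketch of the MSGD update in heavy-ball form (the hyperparameter names are illustrative); its continuous-in-time counterpart is a damped second-order dynamics in the position and velocity variables.

```python
import numpy as np

def msgd(grad_sample, x0, lr=1e-2, momentum=0.9, steps=1000, rng=None):
    """Minimal momentum SGD (heavy-ball form); grad_sample(x, rng)
    returns an unbiased stochastic gradient at x."""
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(steps):
        g = grad_sample(x, rng)
        v = momentum * v - lr * g   # exponentially weighted gradient average
        x = x + v                   # move along the accumulated velocity
    return x
```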
no code implementations • 12 Jul 2022 • Benjamin Gess, Rishabh S. Gvalani, Vitalii Konarovskyi
The convergence of stochastic interacting particle systems in the mean-field limit to solutions of conservative stochastic partial differential equations is established with an optimal rate of convergence.
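As a toy illustration of the finite-N object whose mean-field limit is taken, one can simulate an interacting particle system and track its empirical measure; the linear attraction drift below is an illustrative assumption, not the paper's model.

```python
import numpy as np

def particle_system(N=1000, T=1.0, dt=1e-3, sigma=1.0, rng=None):
    """Euler-Maruyama simulation of a toy mean-field system
    dX_i = -(X_i - mean_j X_j) dt + sigma dB_i, i = 1..N.
    The empirical measure of X_1..X_N is the quantity whose
    mean-field limit is studied."""
    rng = rng or np.random.default_rng(0)
    x = rng.standard_normal(N)
    for _ in range(int(T / dt)):
        drift = -(x - x.mean())   # attraction toward the empirical mean
        x = x + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal(N)
    return x                      # particle positions at time T
```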
no code implementations • 22 Dec 2020 • Ľubomír Baňas, Benjamin Gess, Christian Vieth
We study a general class of singular degenerate parabolic stochastic partial differential equations (SPDEs) which include, in particular, the stochastic porous medium equations and the stochastic fast diffusion equation.
Numerical Analysis
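For a concrete (and deliberately simplified) picture, the sketch below runs an explicit finite-difference Euler-Maruyama scheme for a one-dimensional stochastic porous medium equation du = Δ(sign(u)|u|^m) dt + ε u dW; the discretization, boundary conditions, and scalar multiplicative noise are illustrative assumptions, far simpler than the class of equations and schemes analyzed in the paper.

```python
import numpy as np

def stochastic_porous_medium_1d(m=2.0, n=100, T=0.1, dt=1e-5, eps=0.1, rng=None):
    """Explicit finite-difference sketch of du = Laplace(sign(u)|u|^m) dt
    + eps * u dW on (0, 1) with zero Dirichlet boundary conditions
    (m > 1: porous medium; 0 < m < 1: fast diffusion)."""
    rng = rng or np.random.default_rng(0)
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    u = np.maximum(0.0, 1.0 - (4.0 * (x - 0.5)) ** 2)   # compactly supported bump
    for _ in range(int(T / dt)):
        v = np.sign(u) * np.abs(u) ** m                  # degenerate nonlinearity
        lap = np.empty_like(v)
        lap[1:-1] = (v[2:] - 2.0 * v[1:-1] + v[:-2]) / h**2
        lap[0] = (v[1] - 2.0 * v[0]) / h**2              # zero boundary values
        lap[-1] = (v[-2] - 2.0 * v[-1]) / h**2
        dW = np.sqrt(dt) * rng.standard_normal()
        u = u + lap * dt + eps * u * dW                  # Euler-Maruyama step
    return x, u
```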
no code implementations • 3 Dec 2020 • Nicolas Dirr, Benjamin Fehrman, Benjamin Gess
In the small-noise limit, we show that the fluctuations of the solutions are, to first order, the same as the fluctuations of the particle system.
Probability • Analysis of PDEs
no code implementations • 2 Apr 2019 • Benjamin Fehrman, Benjamin Gess, Arnulf Jentzen
We prove local convergence to minima and estimates on the rate of convergence for the stochastic gradient descent method for objective functions that are not necessarily globally convex or contracting.
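A minimal sketch of the setting: plain SGD with Robbins-Monro step sizes eta_k = c/(k+1) on a toy non-convex objective, started near a local minimum. The step-size schedule, the objective, and the noise level are illustrative assumptions, not the paper's hypotheses.

```python
import numpy as np

def sgd(grad_sample, x0, c=0.1, steps=10000, rng=None):
    """Plain SGD with decaying step sizes eta_k = c / (k + 1)."""
    rng = rng or np.random.default_rng(0)
    x = float(x0)
    for k in range(steps):
        x -= (c / (k + 1)) * grad_sample(x, rng)
    return x

# Toy non-convex objective f(x) = (x^2 - 1)^2 with noisy gradients;
# started near the local minimum at x = 1, the iterates converge to it.
grad = lambda x, rng: 4.0 * x * (x**2 - 1.0) + 0.1 * rng.standard_normal()
print(sgd(grad, x0=1.2))   # approximately 1.0
```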