Search Results for author: Theodoros Mamalis

Found 2 papers, 0 papers with code

Accelerated Almost-Sure Convergence Rates for Nonconvex Stochastic Gradient Descent using Stochastic Learning Rates

no code implementations · 25 Oct 2021 · Theodoros Mamalis, Dusan Stipanovic, Petros Voulgaris

Theoretical results show that, with an appropriate stochastic learning rate, Stochastic Gradient Descent attains accelerated almost-sure convergence rates in a nonconvex setting compared to a deterministic-learning-rate scheme.
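
For context, a hedged sketch of what a multiplicative stochastic learning rate means for the SGD iteration (the symbols eta_t, q_t, and xi_t are illustrative, not taken from the paper):

    x_{t+1} = x_t - \eta_t \, q_t \, \nabla f(x_t; \xi_t)

where \eta_t is a deterministic base step size, q_t is a positive random multiplier applied to it, and \nabla f(x_t; \xi_t) is a stochastic gradient; setting q_t \equiv 1 recovers the deterministic-learning-rate scheme the abstract compares against.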

Stochastic Learning Rate Optimization in the Stochastic Approximation and Online Learning Settings

no code implementations · 20 Oct 2021 · Theodoros Mamalis, Dusan Stipanovic, Petros Voulgaris

In this work, multiplicative stochasticity is applied to the learning rate of stochastic optimization algorithms, giving rise to stochastic learning-rate schemes.

Stochastic Optimization
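
As a rough illustration of the multiplicative stochastic learning-rate idea described above (a minimal sketch, not the authors' implementation: the uniform multiplier distribution, its bounds, and the toy objective are assumptions):

import numpy as np

def sgd_stochastic_lr(grad, x0, base_lr=0.01, steps=1000, rng=None):
    """SGD where the learning rate is scaled by a random multiplier q_t each step."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        q_t = rng.uniform(0.5, 1.5)           # multiplicative noise on the step size (assumed distribution)
        x = x - base_lr * q_t * grad(x, rng)  # stochastic gradient step with stochastic learning rate
    return x

# Usage on a noisy quadratic, f(x) = 0.5 * ||x||^2, whose exact gradient is x:
noisy_grad = lambda x, rng: x + 0.1 * rng.standard_normal(x.shape)
x_final = sgd_stochastic_lr(noisy_grad, x0=np.ones(5))

Keeping the multiplier's mean at 1, as here, leaves the expected step size equal to base_lr, so the randomization perturbs the step size without changing its average scale.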
