Search Results for author: Xiaoyin Hu

Found 3 papers, 0 papers with code

Developing Lagrangian-based Methods for Nonsmooth Nonconvex Optimization

no code implementations · 15 Apr 2024 · Nachuan Xiao, Kuangyu Ding, Xiaoyin Hu, Kim-Chuan Toh

Preliminary numerical experiments on deep learning tasks illustrate that our proposed framework yields efficient variants of Lagrangian-based methods with convergence guarantees for nonconvex nonsmooth constrained optimization problems.

SGD-type Methods with Guaranteed Global Stability in Nonsmooth Nonconvex Optimization

no code implementations · 19 Jul 2023 · Nachuan Xiao, Xiaoyin Hu, Kim-Chuan Toh

We further illustrate that our scheme yields variants of SGD-type methods, which enjoy guaranteed convergence in training nonsmooth neural networks.
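The abstract refers to SGD-type methods for training nonsmooth networks. As a generic illustration of the underlying idea only (a minimal sketch of stochastic subgradient descent on the nonsmooth toy objective f(x) = |x|, not the paper's stabilized scheme; all names here are assumptions):

```python
import random

def subgradient_abs(x):
    # A subgradient of f(x) = |x|; at x = 0 any value in [-1, 1] is valid,
    # and we pick 0 for simplicity.
    if x > 0:
        return 1.0
    if x < 0:
        return -1.0
    return 0.0

def sgd_nonsmooth(x0, lr=0.1, steps=100, noise=0.01, seed=0):
    # Plain stochastic subgradient descent; the Gaussian noise mimics
    # the stochastic gradient estimates that arise in minibatch training.
    rng = random.Random(seed)
    x = x0
    for _ in range(steps):
        g = subgradient_abs(x) + rng.gauss(0.0, noise)
        x -= lr * g
    return x

# The iterate ends up near the minimizer x = 0, oscillating at the
# scale of the step size because the objective is nonsmooth there.
print(sgd_nonsmooth(3.0))
```

Even on this one-dimensional example the iterates do not settle exactly at the kink, which is the kind of behavior that motivates the convergence analyses these papers develop.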

Adam-family Methods for Nonsmooth Optimization with Convergence Guarantees

no code implementations · 6 May 2023 · Nachuan Xiao, Xiaoyin Hu, Xin Liu, Kim-Chuan Toh

In this paper, we present a comprehensive study on the convergence properties of Adam-family methods for nonsmooth optimization, especially in the training of nonsmooth neural networks.
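For reference, a minimal sketch of a standard Adam update applied to the same nonsmooth toy objective f(x) = |x| (the textbook Adam recurrence, not the paper's specific Adam-family variants; names and hyperparameters here are generic assumptions):

```python
import math

def adam_step(x, g, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    # One Adam update: exponential moving averages of the (sub)gradient
    # and its square, with the usual bias correction at step t >= 1.
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g * g
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    x = x - lr * m_hat / (math.sqrt(v_hat) + eps)
    return x, m, v

# Minimize f(x) = |x|, which is nonsmooth at 0, using a subgradient.
x, m, v = 2.0, 0.0, 0.0
for t in range(1, 3001):
    g = 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)
    x, m, v = adam_step(x, g, m, v, t)
# x is now close to the minimizer 0.
```

Because the subgradient is discontinuous at the minimizer, proving that such momentum-based recursions actually converge for nonsmooth networks is nontrivial, which is the gap this paper's analysis addresses.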
