no code implementations • 8 Jun 2022 • Yu-Guan Hsieh, Yassine Laguel, Franck Iutzeler, Jérôme Malick
We consider decentralized optimization problems in which a number of agents collaborate to minimize the average of their local functions by exchanging information over an underlying communication graph.
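This setup can be illustrated with a minimal sketch of decentralized gradient descent: each agent mixes its iterate with its neighbors' (gossip averaging with a doubly stochastic matrix) and takes a local gradient step. The quadratic local losses, ring topology, and step size below are illustrative choices, not the paper's actual algorithm or experiments.

```python
import numpy as np

def decentralized_gd(b, W, lr=0.05, iters=2000):
    """Decentralized gradient descent: agent i minimizes its local loss
    f_i(x) = 0.5 * (x - b[i])**2 while averaging with its neighbors via
    the mixing matrix W. The network average of the fixed point equals
    mean(b), the minimizer of the average of the local functions."""
    x = np.zeros_like(b, dtype=float)
    for _ in range(iters):
        # gossip step (mix with neighbors) followed by a local gradient step
        x = W @ x - lr * (x - b)
    return x

# 4 agents on a ring graph, uniform weight 1/3 on self and each neighbor
# (symmetric and doubly stochastic, so consensus preserves the average)
W = np.array([[1, 1, 0, 1],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [1, 0, 1, 1]]) / 3.0
b = np.array([1.0, 2.0, 3.0, 4.0])
x = decentralized_gd(b, W)
# all agents end up near the global minimizer mean(b) = 2.5
```

With a constant step size the agents reach only an approximate consensus around the global minimizer; diminishing steps or gradient-tracking corrections remove this residual bias.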
1 code implementation • 17 Dec 2021 • Krishna Pillutla, Yassine Laguel, Jérôme Malick, Zaid Harchaoui
We present a federated learning framework that is designed to robustly deliver good predictive performance across individual clients with heterogeneous data.
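One way to deliver good performance across heterogeneous clients is for the server to aggregate updates with weights concentrated on the clients currently suffering the largest losses, in the spirit of a superquantile (CVaR) objective. The sketch below is a hypothetical illustration of such tail-weighted aggregation, not the paper's exact framework; the function names and the equal-probability assumption are ours.

```python
import numpy as np

def superquantile_weights(losses, theta):
    """Weights that put all mass on the worst `theta` fraction of clients
    (discrete CVaR weights, assuming equal client probabilities 1/n)."""
    n = len(losses)
    order = np.argsort(losses)[::-1]   # clients from worst to best loss
    w = np.zeros(n)
    remaining = theta                  # tail probability mass left to place
    for i in order:
        take = min(1.0 / n, remaining)
        w[i] = take / theta            # normalize so the weights sum to 1
        remaining -= take
        if remaining <= 0:
            break
    return w

def aggregate(updates, losses, theta=0.5):
    """Server step (hypothetical API): average client updates with
    superquantile weights, emphasizing high-loss clients."""
    w = superquantile_weights(np.asarray(losses), theta)
    return np.average(np.asarray(updates), axis=0, weights=w)

losses = [1.0, 2.0, 3.0, 4.0]
w = superquantile_weights(np.array(losses), theta=0.5)
# only the two worst clients receive weight: [0, 0, 0.5, 0.5]
```

Setting theta = 1 recovers plain uniform averaging, so the tail level interpolates between standard FedAvg-style aggregation and a focus on the hardest clients.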
1 code implementation • 30 Sep 2020 • Yassine Laguel, Jérôme Malick, Zaid Harchaoui
Classical supervised learning via empirical risk (or negative log-likelihood) minimization hinges upon the assumption that the testing distribution coincides with the training distribution.
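When the testing distribution may differ from the training one, one robust alternative to the empirical average is the superquantile (conditional value at risk) of the per-example losses: the expected loss over the worst theta-fraction of the data. A minimal sketch of that quantity, using the standard quantile-plus-excess formula (the function name and tail level are illustrative):

```python
import numpy as np

def superquantile(losses, theta):
    """Superquantile (CVaR) at tail level theta: the expected loss over
    the worst theta-fraction of examples. Computed via the formula
    q + E[(L - q)_+] / theta, where q is the (1 - theta)-quantile."""
    losses = np.asarray(losses, dtype=float)
    q = np.quantile(losses, 1.0 - theta)
    return q + np.maximum(losses - q, 0.0).mean() / theta

losses = np.array([1.0, 2.0, 3.0, 4.0])
mean_loss = losses.mean()             # 2.5: the standard ERM objective
sq_loss = superquantile(losses, 0.5)  # 3.5: average of the worst half
```

Minimizing this quantity instead of the mean penalizes models whose errors concentrate on a subpopulation, which is exactly the failure mode when train and test distributions diverge.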
1 code implementation • arXiv preprint 2020 • Yassine Laguel, Krishna Pillutla, Jérôme Malick, Zaid Harchaoui
We propose a federated learning framework to handle heterogeneous client devices whose local data do not conform to the population data distribution.