no code implementations • 2 May 2024 • Bingshan Hu, Zhiming Huang, Tianyue H. Zhang, Mathias Lécuyer, Nidhi Hegde
We study Thompson Sampling-based algorithms for stochastic bandits with bounded rewards.
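As context, here is a minimal sketch of Beta-Bernoulli Thompson Sampling with the standard binarization trick for bounded rewards; this is the generic algorithm the paper analyzes, not its specific variants, and all names here are illustrative.

```python
import random

def thompson_sampling(reward_fns, horizon, seed=0):
    """Beta-Bernoulli Thompson Sampling for rewards in [0, 1].

    A non-binary reward r in [0, 1] is handled by the standard
    binarization trick: treat it as a Bernoulli(r) coin flip.
    """
    rng = random.Random(seed)
    k = len(reward_fns)
    alpha = [1] * k  # Beta posterior parameters (successes + 1)
    beta = [1] * k   # Beta posterior parameters (failures + 1)
    total = 0.0
    for _ in range(horizon):
        # Sample a plausible mean for each arm from its posterior,
        # then play the arm whose sample is largest.
        samples = [rng.betavariate(alpha[i], beta[i]) for i in range(k)]
        arm = max(range(k), key=lambda i: samples[i])
        r = reward_fns[arm](rng)  # bounded reward in [0, 1]
        total += r
        if rng.random() < r:      # binarize the observed reward
            alpha[arm] += 1
        else:
            beta[arm] += 1
    return total

# Two Bernoulli arms with means 0.3 and 0.7; the sampler should
# quickly concentrate its pulls on the second arm.
arms = [lambda rng: float(rng.random() < 0.3),
        lambda rng: float(rng.random() < 0.7)]
```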
no code implementations • 12 Feb 2024 • Mishaal Kazmi, Hadrien Lautraite, Alireza Akbari, Mauricio Soroco, Qiaoyue Tang, Tao Wang, Sébastien Gambs, Mathias Lécuyer
We introduce a privacy auditing scheme for ML models that relies on membership inference attacks using generated data as "non-members".
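To illustrate the building block such audits rest on, here is a toy loss-threshold membership-inference attack; the function name, threshold rule, and setup are hypothetical and are not the paper's scheme, which instead uses generated data to stand in for non-members.

```python
def loss_threshold_audit(member_losses, nonmember_losses, threshold):
    """Toy membership-inference audit: predict 'member' whenever the
    model's loss on a point falls below a threshold, then report the
    attack's accuracy over known members and non-members. Accuracy far
    above 0.5 suggests the model leaks membership information.
    Illustrative only; not the paper's auditing scheme."""
    correct = sum(loss < threshold for loss in member_losses)
    correct += sum(loss >= threshold for loss in nonmember_losses)
    return correct / (len(member_losses) + len(nonmember_losses))
```

In practice the non-member losses would come from held-out (or, per this paper, generated) data the model never trained on.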
1 code implementation • 21 Dec 2023 • Qiaoyue Tang, Frederick Shpilevskiy, Mathias Lécuyer
The Adam optimizer is a popular choice in contemporary deep learning due to its strong empirical performance.
no code implementations • 21 Apr 2023 • Qiaoyue Tang, Mathias Lécuyer
We observe that the traditional use of DP with the Adam optimizer introduces a bias in the second moment estimation, due to the addition of independent noise in the gradient computation.
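The source of that bias can be checked empirically: for independent noise n ~ N(0, sigma^2), E[(g + n)^2] = g^2 + sigma^2, so the second-moment estimate is inflated by sigma^2. A quick Monte Carlo sketch (illustrative parameter values, not the paper's experiment):

```python
import random

def second_moment_bias(grad=0.5, sigma=1.0, trials=100_000, seed=0):
    """Empirically verify E[(g + n)^2] = g^2 + sigma^2 for n ~ N(0, sigma^2).

    DP training adds independent Gaussian noise n to each gradient g,
    so Adam's estimate of the second moment g^2 is inflated by roughly
    sigma^2 -- the bias described above.
    """
    rng = random.Random(seed)
    noisy_sq = [(grad + rng.gauss(0.0, sigma)) ** 2 for _ in range(trials)]
    return sum(noisy_sq) / trials
```

With grad=0.5 and sigma=1.0 the estimate lands near 0.25 + 1.0 = 1.25 rather than the true 0.25.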
no code implementations • 26 Dec 2022 • Pierre Tholoniat, Kelly Kostopoulou, Mosharaf Chowdhury, Asaf Cidon, Roxana Geambasu, Mathias Lécuyer, Junfeng Yang
This DP budget can be regarded as a new type of compute resource in workloads of multiple ML models training on user data.
no code implementations • 3 Dec 2022 • Shiqi He, Qifan Yan, Feijie Wu, Lanjun Wang, Mathias Lécuyer, Ivan Beschastnikh
Federated learning (FL) is an effective technique to directly involve edge devices in machine learning training while preserving client privacy.
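For readers new to FL, one round of the canonical Federated Averaging loop can be sketched as follows; this is the generic FedAvg template, not this paper's contribution, and the helper names are illustrative.

```python
def fedavg_round(global_model, client_datasets, local_update):
    """One round of Federated Averaging: each client updates the global
    model locally on its own data, and only model weights -- never raw
    data -- are sent back to the server and averaged. `local_update`
    maps (model_weights, dataset) to updated weights."""
    updates = [local_update(list(global_model), data)
               for data in client_datasets]
    n = len(updates)
    return [sum(w[i] for w in updates) / n
            for i in range(len(global_model))]
```

Keeping raw data on-device is what makes FL privacy-preserving at the systems level, though formal guarantees still require mechanisms such as DP on top.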
no code implementations • 28 Oct 2021 • Mathias Lécuyer, Sang Hoon Kim, Mihir Nanavati, Junchen Jiang, Siddhartha Sen, Amit Sharma, Aleksandrs Slivkins
We develop a methodology, called Sayer, that leverages implicit feedback to evaluate and train new system policies.
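One standard way to evaluate a new policy from logged feedback is inverse propensity scoring; the sketch below shows that generic estimator under a randomized logging policy, and is not necessarily Sayer's exact estimator.

```python
def ips_estimate(logs, new_policy):
    """Off-policy evaluation with inverse propensity scoring: estimate
    a new policy's average reward from data logged by a randomized old
    policy. Each log entry is (context, action, reward, propensity),
    where propensity is the old policy's probability of taking the
    logged action in that context."""
    total = 0.0
    for context, action, reward, propensity in logs:
        if new_policy(context) == action:
            # Reweight matching actions by how unlikely they were
            # under the logging policy.
            total += reward / propensity
    return total / len(logs)
```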
1 code implementation • 29 Jun 2021 • Tao Luo, Mingen Pan, Pierre Tholoniat, Asaf Cidon, Roxana Geambasu, Mathias Lécuyer
We describe PrivateKube, an extension to the popular Kubernetes datacenter orchestrator that adds privacy as a new type of resource to be managed alongside traditional compute resources, such as CPU, GPU, and memory.
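The core idea of treating DP budget like a consumable resource can be sketched in a few lines; this toy allocator is illustrative only and does not reflect PrivateKube's actual scheduling API or its block-based budget accounting.

```python
class PrivacyBudget:
    """Toy privacy-budget resource: training pipelines request epsilon
    and the allocator grants requests only while budget remains --
    mirroring, at a high level, how a DP budget can be managed like
    CPU or memory. Illustrative only; not PrivateKube's API."""

    def __init__(self, epsilon_total):
        self.remaining = epsilon_total

    def request(self, epsilon):
        """Grant epsilon if available; unlike CPU, spent budget is
        never returned, since privacy loss is irreversible."""
        if epsilon <= self.remaining:
            self.remaining -= epsilon
            return True
        return False
```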
1 code implementation • 2 Mar 2021 • Mathias Lécuyer
Differential Privacy (DP) is the leading approach to privacy-preserving deep learning.