Locally Differentially Private Minimum Finding

27 May 2019 · Kazuto Fukuchi, Chia-Mu Yu, Arashi Haishima, Jun Sakuma

We investigate the problem of finding the minimum, in which each user holds a real value and we want to estimate the minimum of these values under the local differential privacy constraint. We show that this problem is fundamentally difficult: no mechanism can be consistent in the worst case. Instead of considering the worst case, we aim to construct a private mechanism whose error rate adapts to how easy the minimum is to estimate. As a measure of easiness, we introduce a parameter $\alpha$ that characterizes the fatness of the minimum-side tail of the user data distribution. We show that our mechanism achieves $O((\ln^6 N/\epsilon^2 N)^{1/2\alpha})$ error without knowledge of $\alpha$, and that this rate is near-optimal in the sense that any mechanism incurs $\Omega((1/\epsilon^2 N)^{1/2\alpha})$ error. Furthermore, empirical evaluations on synthetic datasets demonstrate that our mechanism outperforms a naive mechanism. We also conduct experiments on the MovieLens dataset and a purchase history dataset, demonstrating that our algorithm achieves $\tilde{O}((1/N)^{1/2\alpha})$ error adaptively to $\alpha$.
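The paper's adaptive mechanism is not reproduced here, but the problem setting can be illustrated with a minimal sketch: each user answers an $\epsilon$-LDP threshold query via randomized response, and the aggregator binary-searches for the smallest threshold with detectable mass below it. The function names, the detection margin `tau`, and the per-round user split are our own assumptions for illustration, not the authors' algorithm.

```python
import math
import random

def randomized_response(bit, epsilon):
    # Report the true bit with probability e^eps / (1 + e^eps),
    # and the flipped bit otherwise; this satisfies eps-LDP.
    p = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return bit if random.random() < p else 1 - bit

def estimate_fraction_below(values, threshold, epsilon):
    # Each user privately reports 1{value <= threshold};
    # the aggregator debiases the noisy mean.
    p = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    reports = [randomized_response(1 if v <= threshold else 0, epsilon)
               for v in values]
    raw = sum(reports) / len(reports)
    # E[raw] = (2p - 1) * frac + (1 - p), so invert the affine map.
    return (raw - (1.0 - p)) / (2.0 * p - 1.0)

def estimate_minimum(values, epsilon, lo=0.0, hi=1.0, rounds=10):
    # Binary search on [lo, hi] for the smallest threshold with
    # non-negligible estimated mass below it. Each round queries a
    # fresh disjoint batch of users so the total privacy cost per
    # user stays at eps (a heuristic design choice, not the paper's).
    chunk = len(values) // rounds
    tau = 3.0 * math.sqrt(1.0 / (epsilon ** 2 * chunk))  # heuristic margin
    for r in range(rounds):
        mid = (lo + hi) / 2.0
        batch = values[r * chunk:(r + 1) * chunk]
        if estimate_fraction_below(batch, mid, epsilon) > tau:
            hi = mid  # detectable mass below mid: minimum is lower
        else:
            lo = mid  # nothing detected: minimum is likely higher
    return hi
```

Because randomized response injects constant-scale noise per user, such a naive mechanism can only detect mass of order $1/(\epsilon\sqrt{n})$ below a threshold; how fast that translates into error in the minimum depends on the tail parameter $\alpha$, which is exactly the adaptivity the paper studies.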
