Privacy Loss Classes: The Central Limit Theorem in Differential Privacy

Abstract

Quantifying the privacy loss of a privacy-preserving mechanism on potentially sensitive data is a complex and well-researched topic; the de-facto standard privacy measures are ε-differential privacy (DP) and its versatile relaxation, (ε, δ)-approximate differential privacy (ADP). Recently, novel variants of (A)DP have focused on giving tighter privacy bounds under continual observation. In this paper we unify many previous works via the privacy loss distribution (PLD) of a mechanism. We show that for non-adaptive mechanisms, the privacy loss under sequential composition undergoes a convolution and converges to a Gauss distribution (a central limit theorem for DP). We derive several relevant insights: we can now characterize mechanisms by their privacy loss class, i.e., by the Gauss distribution to which their PLD converges, which allows us to give novel ADP bounds for mechanisms based on their privacy loss class; and we derive exact analytical guarantees for the approximate randomized response mechanism as well as an exact analytical, closed-form expression for the Gauss mechanism that, given ε, calculates the δ for which the mechanism is (ε, δ)-ADP (not an over-approximating bound).
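
As an illustration of the closed-form guarantee mentioned above, the sketch below (in Python; not part of the paper's artifact, and the function name gauss_mechanism_delta, the parameters sigma and sensitivity, and the example values are placeholders chosen here) evaluates the tight δ for a given ε for the Gauss mechanism, using the standard-normal CDF Φ.

```python
# Illustrative sketch (not the authors' implementation): tight (eps, delta)
# trade-off of the Gauss mechanism with L2-sensitivity s and noise scale sigma,
#   delta(eps) = Phi(s/(2*sigma) - eps*sigma/s) - e^eps * Phi(-s/(2*sigma) - eps*sigma/s),
# where Phi is the standard normal CDF.
from math import exp, sqrt

from scipy.stats import norm


def gauss_mechanism_delta(eps: float, sigma: float, sensitivity: float = 1.0) -> float:
    """Smallest delta such that the Gauss mechanism is (eps, delta)-ADP."""
    a = sensitivity / (2.0 * sigma)   # mean of the Gaussian PLD divided by its std. dev.
    b = eps * sigma / sensitivity     # eps divided by the PLD's std. dev.
    return norm.cdf(a - b) - exp(eps) * norm.cdf(-a - b)


if __name__ == "__main__":
    # Single Gaussian query with unit sensitivity and noise sigma = 4 (example values).
    print(gauss_mechanism_delta(eps=1.0, sigma=4.0))

    # k-fold non-adaptive composition convolves the (Gaussian) privacy loss
    # distribution, which for the Gauss mechanism is equivalent to a single
    # query with sensitivity scaled by sqrt(k).
    k = 128
    print(gauss_mechanism_delta(eps=1.0, sigma=4.0, sensitivity=sqrt(k)))
```

The composed case illustrates the convolution view: a single Gaussian query with sensitivity s and noise σ has a Gaussian PLD with mean s²/(2σ²) and variance s²/σ², and k independent queries simply add these parameters, so the Gauss mechanism stays within its own privacy loss class under composition.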
