Efficient Utility Improvement for Location Privacy

Konstantinos Chatzikokolakis 1 , Ehab ElSalamouny 2  and Catuscia Palamidessi 3
  • 1 CNRS, Inria and LIX, France
  • 2 Inria, France and Faculty of Computers and Informatics, Egypt
  • 3 Inria and LIX, France


The continuously increasing use of location-based services poses a serious threat to the privacy of users. A natural defense is to employ an obfuscation mechanism, such as those providing geo-indistinguishability, a framework for obtaining formal privacy guarantees that has become popular in recent years.

Ideally, one would like to employ an optimal obfuscation mechanism, providing the best utility among those satisfying the required privacy level. In theory, optimal mechanisms can be constructed via linear programming. In practice, however, this is only feasible for a very small number of locations. As a consequence, all known applications of geo-indistinguishability simply use noise drawn from a planar Laplace distribution.
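For illustration, sampling from the planar Laplace distribution is straightforward: the noise radius follows a Gamma distribution with shape 2 and scale 1/ε, and the angle is uniform. The sketch below assumes this standard construction; the function names are ours, not from the paper.

```python
import math
import random

def planar_laplace_noise(eps):
    """Sample a noise vector (dx, dy) whose density is proportional
    to exp(-eps * r), where r is the distance from the origin."""
    # The radial distance of planar Laplace noise is Gamma(shape=2, scale=1/eps);
    # the direction is uniform on [0, 2*pi).
    r = random.gammavariate(2, 1.0 / eps)
    theta = random.uniform(0, 2 * math.pi)
    return r * math.cos(theta), r * math.sin(theta)

def obfuscate(x, y, eps):
    """Report a noisy location by adding planar Laplace noise."""
    dx, dy = planar_laplace_noise(eps)
    return x + dx, y + dy
```

Note that ε here is in units of inverse distance, so the expected displacement is 2/ε: a larger ε means less noise and weaker privacy.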

In this work, we study methods for substantially improving the utility of location obfuscation, while maintaining practical applicability as a main goal. We provide such solutions for both infinite (continuous or discrete) and large but finite domains of locations, using a Bayesian remapping procedure as a key ingredient. We evaluate our techniques on two complete real-world datasets, without any restriction on the evaluation area, and show significant utility improvements with respect to the standard planar Laplace approach.

