Search Results

Showing 1–10 of 23 items for:

  • IT-Security and Cryptology
Open access

Tibor Jager and Andy Rupp

Abstract

We formalize and construct black-box accumulation (BBA), a useful building block for numerous important user-centric protocols, including loyalty systems, refund systems, and incentive systems (as employed, e.g., in participatory sensing and vehicle-to-grid scenarios). A core requirement all these systems share is a mechanism that lets users collect and sum up values (call them incentives, bonus points, reputation points, etc.) issued by other parties in a privacy-preserving way, such that curious operators cannot link the different transactions of a user. At the same time, a group of malicious users must not be able to cheat the system by pretending to have collected a higher amount than was actually issued to them.

As a first contribution, we fully formalize the core functionality and properties of this important building block. Furthermore, we present a generic and non-interactive construction of a BBA system based on homomorphic commitments, digital signatures, and non-interactive zero-knowledge proofs of knowledge. For our construction, we formally prove security and privacy properties. Finally, we propose a concrete instantiation of our construction using Groth-Sahai commitments and proofs as well as the optimal structure-preserving signature scheme of Abe et al. and analyze its efficiency.
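
The additive homomorphism at the heart of such a construction can be illustrated with Pedersen-style commitments, whose product commits to the sum of the committed values. The sketch below is only that illustration: the signatures and zero-knowledge proofs of the full BBA construction are omitted, and the group parameters are toy values, not a secure instantiation.

```python
# Toy sketch of the homomorphic-commitment idea behind accumulation:
# multiplying Pedersen commitments yields a commitment to the summed value,
# so point balances can be accumulated without opening individual grants.
import secrets

p = 2**127 - 1          # toy prime modulus (illustrative, not secure)
g = 3                   # toy generators; a real setup derives these verifiably
h = 7

def commit(value, randomness):
    """Pedersen commitment C = g^value * h^randomness mod p."""
    return (pow(g, value, p) * pow(h, randomness, p)) % p

# An issuer grants points in separate, unlinkable transactions.
points = [5, 12, 3]
rands = [secrets.randbelow(p - 1) for _ in points]
comms = [commit(v, r) for v, r in zip(points, rands)]

# Anyone can multiply the commitments; the product commits to the total.
acc = 1
for c in comms:
    acc = (acc * c) % p

assert acc == commit(sum(points), sum(rands))
```

In the full construction the user would additionally hold issuer signatures on the commitments and prove knowledge of valid openings in zero knowledge when redeeming the accumulated total.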

Open access

Apostolos Pyrgelis, Carmela Troncoso and Emiliano De Cristofaro


Open access

Tao Wang and Ian Goldberg


Open access

Chad Spensky, Jeffrey Stewart, Arkady Yerukhimovich, Richard Shay, Ari Trachtenberg, Rick Housley and Robert K. Cunningham


Open access

Sara Krehbiel

Abstract

In many real-world scenarios, terms of service allow a producer of a service to collect data from its users. Producers value data but often only compensate users for their data indirectly, with reduced prices for the service. This work considers how a producer (data analyst) may offer differential privacy as a premium service for its users (data subjects), where the degree of privacy offered may itself depend on the user data. Along the way, it strengthens prior negative results for privacy markets to the pay-for-privacy setting and develops a new notion of endogenous differential privacy. A positive result for endogenous privacy is given in the form of a class of mechanisms for privacy-as-a-service markets that 1) determine ɛ using the privacy and accuracy preferences of a heterogeneous body of data subjects and a single analyst, 2) collect and distribute payments for the chosen level of privacy, and 3) privately analyze the database. These mechanisms are endogenously differentially private with respect to data subjects’ privacy preferences as well as their private data, they directly elicit data subjects’ true preferences, and they determine a level of privacy that is efficient given all parties’ preferences.
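
Once an ɛ has been agreed, step 3 reduces to a standard differentially private analysis. A minimal sketch, using the classic Laplace mechanism for a count query (the preference-elicitation and payment steps, which are the paper's main contributions, are not modeled here, and the data and query are made up):

```python
# Standard Laplace mechanism for a sensitivity-1 count query at an agreed
# privacy level epsilon; this illustrates only the final "privately analyze
# the database" step of a privacy-as-a-service mechanism.
import random

def laplace_count(data, predicate, epsilon):
    """Answer a count query with epsilon-differential privacy."""
    true_count = sum(1 for row in data if predicate(row))
    # The difference of two Exp(epsilon) draws is Laplace-distributed with
    # scale 1/epsilon; a count query has sensitivity 1, so this is epsilon-DP.
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

# Illustrative database and query: how many subjects are 30 or older?
ages = [23, 35, 41, 29, 52, 38]
noisy = laplace_count(ages, lambda a: a >= 30, epsilon=0.5)
```

A smaller ɛ (stronger privacy, as some subjects may prefer) widens the noise scale 1/ɛ and so degrades the analyst's accuracy, which is exactly the trade-off the market mechanism prices.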

Open access

Mohammad Alaggan, Mathieu Cunche and Sébastien Gambs

Abstract

As communications-enabled devices become more ubiquitous, it becomes easier to track the movements of individuals through the radio signals broadcast by their devices. Thus, while physical analytics platforms have a strong interest in leveraging this information for many purposes, such tracking also threatens the privacy of individuals. To address this issue, we propose a privacy-preserving solution for collecting aggregate mobility patterns while satisfying the strong guarantee of ε-differential privacy. More precisely, we introduce a sanitization mechanism, called Pan-Private BLIP, for efficient, privacy-preserving, and non-interactive approximate distinct counting for physical analytics based on perturbed Bloom filters. We also extend and generalize previous approaches for estimating the distinct count of events and joint events (i.e., intersections and, more generally, t-out-of-n cardinalities). Finally, we evaluate our approach experimentally and compare it to previous ones on real datasets.
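
The perturbed-Bloom-filter idea can be sketched as follows: insert items into a Bloom filter, flip each bit with a randomized-response probability derived from ε, and let the aggregator debias the observed number of set bits before inverting the usual Bloom fill-rate formula. All parameters below are illustrative, and this sketch omits the pan-privacy and joint-event extensions of the actual mechanism.

```python
# BLIP-style private approximate distinct counting: Bloom filter bits are
# perturbed with randomized response; the estimator first debiases the
# number of set bits, then applies the standard Bloom cardinality formula.
import hashlib
import math
import random

M, K = 1024, 3                              # filter size, hash functions
EPS = 3.0                                   # privacy budget
FLIP = 1.0 / (1.0 + math.exp(EPS / K))      # per-bit flip probability

def positions(item):
    """K hash positions for an item (hash choice is illustrative)."""
    return [int(hashlib.sha256(f"{i}:{item}".encode()).hexdigest(), 16) % M
            for i in range(K)]

def blip(items):
    bits = [0] * M
    for it in items:
        for pos in positions(it):
            bits[pos] = 1
    # Randomized response: flip each bit independently with prob FLIP.
    return [b ^ (random.random() < FLIP) for b in bits]

def estimate_count(noisy_bits):
    ones = sum(noisy_bits)
    # Debias E[ones] = (1-FLIP)*t + FLIP*(M-t), then invert the fill rate.
    t = (ones - FLIP * M) / (1 - 2 * FLIP)
    t = min(max(t, 0.0), M - 1)
    return -(M / K) * math.log(1 - t / M)

items = [f"device-{i}" for i in range(100)]
noisy = blip(items)
est = estimate_count(noisy)                 # noisy estimate, roughly 100
```

The estimate is unbiased before clamping but noisy; accuracy improves with larger filters and weaker privacy (larger ε), the same trade-off the paper evaluates on real datasets.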

Open access

Christoph Bösch, Benjamin Erb, Frank Kargl, Henning Kopp and Stefan Pfattheicher

Abstract

Privacy strategies and privacy patterns are fundamental concepts of the privacy-by-design engineering approach. While they support a privacy-aware development process for IT systems, the concepts used by malicious, privacy-threatening parties are generally far less well understood. We argue that understanding the “dark side”, namely how personal data is abused, is of equal importance. In this paper, we introduce the concepts of privacy dark strategies and privacy dark patterns and present a framework that collects, documents, and analyzes such malicious concepts. In addition, we investigate from a psychological perspective why privacy dark strategies are effective. The resulting framework allows for a better understanding of these dark concepts, fosters awareness, and supports the development of countermeasures. We aim to contribute to easier detection and subsequent removal of such approaches from the Internet, to the benefit of its users.

Open access

Anupam Das, Nikita Borisov and Edward Chou

Abstract

The ability to track users’ activities across different websites and visits is a key tool in advertising and surveillance. The HTML5 DeviceMotion interface creates a new opportunity for such tracking via fingerprinting of smartphone motion sensors. We study the feasibility of carrying out such fingerprinting under real-world constraints and on a large scale. In particular, we collect measurements from several hundred users under realistic scenarios and show that the state-of-the-art techniques provide very low accuracy in these settings. We then improve fingerprinting accuracy by changing the classifier as well as incorporating auxiliary information. We also show how to perform fingerprinting in an open-world scenario where one must distinguish between known and previously unseen users.

We next consider the problem of developing fingerprinting countermeasures; we evaluate the usability of a previously proposed obfuscation technique and a newly developed quantization technique via a large-scale user study. We find that both techniques are able to drastically reduce fingerprinting accuracy without significantly impacting the utility of the sensors in web applications.
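
The intuition behind a quantization countermeasure is simple: rounding sensor readings to a coarse grid erases the small per-device calibration offsets that fingerprinting exploits, while preserving enough resolution for common web uses such as orientation detection. The sketch below illustrates only this intuition; the step size is illustrative and the paper's actual quantization scheme may operate differently.

```python
# Minimal illustration of quantization as a fingerprinting countermeasure:
# readings from two devices whose calibration offsets differ by less than
# the grid step become identical after rounding.
def quantize(reading, step=0.1):
    """Round a sensor sample (e.g. accelerometer, m/s^2) to a fixed grid."""
    return round(reading / step) * step

device_a = [0.512, 9.807, 0.031]   # hypothetical raw accelerometer samples
device_b = [0.509, 9.812, 0.028]   # same motion, slightly different offsets
qa = [quantize(x) for x in device_a]
qb = [quantize(x) for x in device_b]
assert qa == qb                    # the devices are now indistinguishable
```

Choosing the step size is the usability question the user study addresses: too coarse and web applications lose utility, too fine and device offsets remain distinguishable.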

Open access

Anastasia Shuba, Athina Markopoulou and Zubair Shafiq

Abstract

Although advertising is a popular strategy for mobile app monetization, it is often desirable to block ads in order to improve usability, performance, privacy, and security. In this paper, we propose NoMoAds to block ads served by any app on a mobile device. NoMoAds leverages the network interface as a universal vantage point: it can intercept, inspect, and block outgoing packets from all apps on a mobile device. NoMoAds extracts features from packet headers and/or payload to train machine learning classifiers for detecting ad requests. To evaluate NoMoAds, we collect and label a new dataset using both EasyList and manually created rules. We show that NoMoAds is effective: it achieves an F-score of up to 97.8% and performs well when deployed in the wild. Furthermore, NoMoAds is able to detect mobile ads that are missed by EasyList (more than one-third of ads in our dataset). We also show that NoMoAds is efficient: it performs ad classification on a per-packet basis in real-time. To the best of our knowledge, NoMoAds is the first mobile ad-blocker to effectively and efficiently block ads served across all apps using a machine learning approach.
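
The shape of such a pipeline, classifying each outgoing request from features of its headers and payload, can be sketched with a toy model. Everything below is illustrative: the token list, training data, and hand-rolled perceptron stand in for the paper's features and trained classifiers.

```python
# Sketch of an ad-request classification pipeline: extract simple token
# features from an outgoing HTTP request and classify it with a tiny
# perceptron. Features and training samples are made up for illustration.
TOKENS = ["ads", "adserver", "track", "analytics", "banner", "doubleclick"]

def features(request_line):
    low = request_line.lower()
    return [1 if t in low else 0 for t in TOKENS]

def train_perceptron(samples, epochs=20, lr=1.0):
    """Classic perceptron training; label 1 means 'ad request'."""
    w = [0.0] * len(TOKENS)
    b = 0.0
    for _ in range(epochs):
        for text, label in samples:
            x = features(text)
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = label - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

training = [
    ("GET /adserver/banner.js HTTP/1.1 Host: ads.example.com", 1),
    ("GET /track/pixel.gif HTTP/1.1 Host: analytics.example.net", 1),
    ("GET /img/logo.png HTTP/1.1 Host: cdn.example.org", 0),
    ("POST /api/v1/messages HTTP/1.1 Host: chat.example.org", 0),
]

def is_ad(request_line, model):
    w, b = model
    x = features(request_line)
    return sum(wi * xi for wi, xi in zip(w, x)) + b > 0

model = train_perceptron(training)
```

Because classification happens per packet at the network interface, the model must be cheap to evaluate, which is why lightweight feature vectors over headers and payload are attractive in this setting.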

Open access

Emmanuel Bello-Ogunu and Mohamed Shehab

Abstract

Research shows that context is important to the privacy perceptions associated with technology. With Bluetooth Low Energy beacons, one of the latest technologies for providing proximity and indoor tracking, the current identifiers that characterize a beacon are not sufficient for ordinary users to make informed privacy decisions about the location information that could be shared. One solution would be to have standardized category and privacy labels, produced by beacon providers or an independent third party. An alternative is an approach driven by users, for users. In this paper, we propose a novel crowdsourcing-based approach to introduce elements of context into beacon encounters. We demonstrate the effectiveness of this approach through a user study in which participants use a crowd-based mobile app, designed as a scavenger hunt game, to collect beacon category and privacy information. Results show that our approach was effective in helping users label beacons according to the specific context of a given beacon encounter, as well as the privacy perceptions associated with it. Labeling was done with an accuracy of 92% and an acceptance rate of 82% for all recommended crowd labels. Lastly, we conclusively show how crowdsourcing for context can be used towards a user-centric framework for privacy management during beacon encounters.
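
One plausible way such crowd labels could be turned into recommendations is simple thresholded majority voting. The sketch below is hypothetical: the label set, vote threshold, and agreement ratio are made up, and the paper's app may aggregate labels differently.

```python
# Hypothetical crowd-label aggregation for beacon encounters: recommend the
# majority category only once enough participants agree on it.
from collections import Counter

def recommend_label(votes, min_votes=3, min_agreement=0.6):
    """Return the majority label if it has enough support, else None."""
    if len(votes) < min_votes:
        return None
    label, count = Counter(votes).most_common(1)[0]
    return label if count / len(votes) >= min_agreement else None

assert recommend_label(["cafe", "cafe", "retail", "cafe"]) == "cafe"
assert recommend_label(["cafe", "retail"]) is None   # too few votes so far
```

Withholding a recommendation until agreement is reached is one way to keep the acceptance rate of crowd labels high, at the cost of leaving sparsely visited beacons unlabeled.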