Irwin Reyes, Primal Wijesekera, Joel Reardon, Amit Elazari Bar On, Abbas Razaghpanah, Narseo Vallina-Rodriguez and Serge Egelman
We present a scalable dynamic analysis framework that allows for the automatic evaluation of the privacy behaviors of Android apps. We use our system to analyze mobile apps’ compliance with the Children’s Online Privacy Protection Act (COPPA), one of the few stringent privacy laws in the U.S. Based on our automated analysis of 5,855 of the most popular free children’s apps, we found that a majority are potentially in violation of COPPA, mainly due to their use of third-party SDKs. While many of these SDKs offer configuration options to respect COPPA by disabling tracking and behavioral advertising, our data suggest that a majority of apps either do not make use of these options or incorrectly propagate them across mediation SDKs. Worse, we observed that 19% of children’s apps collect identifiers or other personally identifiable information (PII) via SDKs whose terms of service outright prohibit their use in child-directed apps. Finally, we show that efforts by Google to limit tracking through the use of a resettable advertising ID have had little success: of the 3,454 apps that share the resettable ID with advertisers, 66% transmit other, non-resettable, persistent identifiers as well, negating any intended privacy-preserving properties of the advertising ID.
Luís T. A. N. Brandão, Nicolas Christin, George Danezis and Anonymous
Available online public/governmental services requiring authentication by citizens have considerably expanded in recent years. This has hindered the usability and security associated with credential management by users and service providers. To address the problem, some countries have proposed nation-scale identification/authentication systems that intend to greatly reduce the burden of credential management, while seemingly offering desirable privacy benefits. In this paper we analyze two such systems: the Federal Cloud Credential Exchange (FCCX) in the United States and GOV.UK Verify in the United Kingdom, which altogether aim at serving more than a hundred million citizens. Both systems propose a brokered identification architecture, where an online central hub mediates user authentications between identity providers and service providers. We show that both FCCX and GOV.UK Verify suffer from serious privacy and security shortcomings, fail to comply with privacy-preserving guidelines they are meant to follow, and may actually degrade user privacy. Notably, the hub can link interactions of the same user across different service providers and has visibility over private identifiable information of citizens. In case of malicious compromise it is also able to undetectably impersonate users. Within the structural design constraints placed on these nation-scale brokered identification systems, we propose feasible technical solutions to the privacy and security issues we identified. We conclude with a strong recommendation that FCCX and GOV.UK Verify be subject to a more in-depth technical and public review, based on a defined and comprehensive threat model, and adopt adequate structural adjustments.
Computation based on genomic data is becoming increasingly popular today, be it for medical or other purposes. Non-medical uses of genomic data in a computation often take place in a server-mediated setting where the server offers the ability for joint genomic testing between the users. Genomic data is undeniably highly sensitive and, in contrast to other types of biometric data, discloses a plethora of information not only about the data owner, but also about his or her relatives. Thus, there is an urgent need to protect genomic data. This is particularly true when the data is used in computation for what we call recreational, non-health-related purposes. Towards this goal, in this work we put forward a framework for server-aided secure two-party computation with a security model motivated by genomic applications. One particular security setting that we treat in this work provides stronger security guarantees with respect to malicious users than the traditional malicious model. In particular, we incorporate certified inputs into secure computation based on garbled circuit evaluation to guarantee that a malicious user is unable to modify her inputs in order to learn unauthorized information about the other user’s data. Our solutions are general in the sense that they can be used to securely evaluate arbitrary functions and offer attractive performance compared to the state of the art. We apply the general constructions to three specific types of genomic tests (paternity, genetic compatibility, and ancestry testing) and implement the constructions. The results show that all such private tests can be executed within a matter of seconds or less despite the large size of one’s genomic data.
Christoph Bösch, Benjamin Erb, Frank Kargl, Henning Kopp and Stefan Pfattheicher
Yaoqi Jia, Guangdong Bai, Prateek Saxena and Zhenkai Liang