Multiple Purposes, Multiple Problems: A User Study of Consent Dialogs after GDPR

Dominique Machuletz 1  and Rainer Böhme 2
  • 1 University of Münster, Germany
  • 2 University of Innsbruck, Austria


The European Union’s General Data Protection Regulation (GDPR) requires websites to ask for consent to the use of cookies for specific purposes. This enlarges the relevant design space for consent dialogs. Websites could try to maximize click-through rates and positive consent decisions, even at the risk of users agreeing to more purposes than intended. We evaluate a practice observed on popular websites by conducting an experiment with one control and two treatment groups (N = 150 university students in two countries). We hypothesize that users’ consent decisions are influenced by (1) the number of options, connecting to the theory of choice proliferation, and (2) the presence of a highlighted default button (“select all”), connecting to theories of social norms and deception in consumer research. The results show that participants who see a default button accept cookies for more purposes than the control group, while being less able to correctly recall their choice. After being reminded of their choice, they regret it more often and perceive the consent dialog as more deceptive than the control group. Whether users are presented with one or three purposes has no significant effect on their decisions and perceptions. We discuss the results and outline policy implications.


