“What if?” Predicting Individual Users’ Smart Home Privacy Preferences and Their Changes

Natã M. Barbosa 1, Joon S. Park 2, Yaxing Yao 3, and Yang Wang 4
  • 1 Syracuse University
  • 2 Syracuse University
  • 3 Syracuse University
  • 4 Syracuse University


Smart home devices challenge a long-held notion that the home is a private and protected place. With this in mind, many developers market their products with a focus on privacy in order to gain user trust, yet privacy tensions arise with the growing adoption of these devices and the risk of inappropriate data practices in the smart home (e.g., secondary use of collected data). Therefore, it is important for developers to consider individual user preferences and how they would change under varying circumstances, in order to identify actionable steps towards developing user trust and exercising privacy-preserving data practices. To help achieve this, we present the design and evaluation of machine learning models that predict (1) personalized allow/deny decisions for different information flows involving various attributes, purposes, and devices (AUC .868), (2) what circumstances may change original decisions (AUC .899), and (3) how much (US dollars) one may be willing to pay or receive in exchange for smart home privacy (RMSE 12.459). We show how developers can use our models to derive actionable steps toward privacy-preserving data practices in the smart home.
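To make the first prediction task concrete, the sketch below shows how an allow/deny classifier over contextual information flows could be set up in principle. This is an illustrative toy, not the paper's actual model, features, or data: the device/attribute/purpose vocabularies, the synthetic "deny anything collected for advertising" user rule, and the plain logistic-regression learner are all assumptions made for the example.

```python
import math
import random

# Hypothetical contextual factors of an information flow (toy values,
# not the study's actual feature set).
DEVICES = ["camera", "speaker", "thermostat"]
ATTRIBUTES = ["video", "audio", "temperature"]
PURPOSES = ["safety", "advertising", "energy_saving"]

FLOWS = [(d, a, p) for d in DEVICES for a in ATTRIBUTES for p in PURPOSES]

def one_hot(flow):
    """Encode a (device, attribute, purpose) tuple as a 0/1 feature vector."""
    vec = []
    for value, vocab in zip(flow, (DEVICES, ATTRIBUTES, PURPOSES)):
        vec.extend(1.0 if value == v else 0.0 for v in vocab)
    return vec

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, labels, epochs=200, lr=0.5):
    """Plain logistic regression fit with stochastic gradient descent."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y  # gradient of the log loss w.r.t. the logit
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict_allow(w, b, flow):
    """True if the model predicts the user would allow this flow."""
    x = one_hot(flow)
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) >= 0.5

# Toy rule standing in for one user's preferences: deny any flow whose
# purpose is advertising, allow everything else.
random.seed(0)
flows = [random.choice(FLOWS) for _ in range(300)]
X = [one_hot(f) for f in flows]
y = [0.0 if f[2] == "advertising" else 1.0 for f in flows]

w, b = train(X, y)
print(predict_allow(w, b, ("camera", "video", "safety")))        # True
print(predict_allow(w, b, ("speaker", "audio", "advertising")))  # False
```

In practice, a model like the paper's would be trained on real elicited preferences, with richer features (e.g., recipient, retention, user traits) and a stronger learner; the one-hot encoding of contextual factors is the part that carries over.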


