Search Results

Showing 1–2 of 2 items for Author: Yaxing Yao
Open access

Yang Wang, Huichuan Xia, Yaxing Yao and Yun Huang

Abstract

Drones are unmanned aircraft that are controlled remotely or operate autonomously. While the extant literature suggests that drones can in principle invade people’s privacy, little is known about how people actually think about drones. Drawing on a series of in-depth interviews conducted in the United States, we provide a novel and rich account of people’s privacy perceptions of drones for civilian uses, both in general and under specific usage scenarios. Our informants raised both physical and information privacy issues with government, organizational, and individual use of drones. Informants’ reasoning about the acceptability of drone use was based in part on whether the drone operates in a public or a private space; however, our informants differed significantly in how they defined public and private spaces. While privacy concerns such as surveillance, data collection, and sharing have also been raised for other tracking technologies such as camera phones and closed-circuit television (CCTV), our interviews highlight two heightened issues with drones: (1) powerful yet inconspicuous data collection, and (2) hidden and inaccessible drone controllers. These two aspects render some existing privacy practices futile (e.g., noticing a recording and asking the controller to stop or delete it). Some informants demanded notifications of drones near them and expected drone controllers to ask for explicit permission before recording. We discuss implications for future privacy-enhancing drone designs.

Open access

Natã M. Barbosa, Joon S. Park, Yaxing Yao and Yang Wang

Abstract

Smart home devices challenge the long-held notion that the home is a private and protected place. With this in mind, many developers market their products with a focus on privacy in order to gain user trust, yet privacy tensions arise with the growing adoption of these devices and the risk of inappropriate data practices in the smart home (e.g., secondary use of collected data). It is therefore important for developers to consider individual user preferences and how they would change under varying circumstances, in order to identify actionable steps toward developing user trust and exercising privacy-preserving data practices. To help achieve this, we present the design and evaluation of machine learning models that predict (1) personalized allow/deny decisions for different information flows involving various attributes, purposes, and devices (AUC .868), (2) what circumstances may change original decisions (AUC .899), and (3) how much (in US dollars) one may be willing to pay or receive in exchange for smart home privacy (RMSE 12.459). We show how developers can use our models to derive actionable steps toward privacy-preserving data practices in the smart home.
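As a rough illustration of the prediction task this abstract describes, the sketch below scores hypothetical smart-home information flows (device, data attribute, purpose) as allow/deny and evaluates the scorer with ROC AUC, the metric the abstract reports. The flows, the rule-based scorer, and all names here are invented for illustration; the paper's actual features and trained models are not shown in the abstract.

```python
# Illustrative sketch only (not the paper's model): score hypothetical
# information flows for allow/deny and evaluate with ROC AUC.

def roc_auc(labels, scores):
    """ROC AUC via the Mann-Whitney statistic: the probability that a
    randomly chosen positive (allow=1) example outranks a randomly
    chosen negative (deny=0) one, counting ties as half."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical flows: (device, attribute, purpose) -> user decision.
flows = [
    (("camera", "video", "security"), 1),          # allow
    (("camera", "video", "advertising"), 0),       # deny
    (("thermostat", "temperature", "comfort"), 1),
    (("speaker", "audio", "advertising"), 0),
    (("speaker", "audio", "voice control"), 1),
    (("doorbell", "video", "secondary use"), 0),
]

def allow_score(flow):
    """Toy stand-in for a trained classifier: purposes users commonly
    reject (advertising, secondary use) get a low allow probability."""
    device, attribute, purpose = flow
    return 0.9 if purpose in ("security", "comfort", "voice control") else 0.1

labels = [decision for _, decision in flows]
scores = [allow_score(flow) for flow, _ in flows]
print(round(roc_auc(labels, scores), 3))  # -> 1.0 on this toy data
```

A real model would be trained on collected decisions rather than hand-written rules, but the evaluation step, ranking allow decisions above deny decisions and summarizing with AUC, is the same.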