Search Results

1–3 of 3 items

Active filters:
  • Author: Yang Yang
  • IT-Security and Cryptology
  • Computer Sciences, other
ErasuCrypto: A Light-weight Secure Data Deletion Scheme for Solid State Drives

Abstract

Securely deleting invalid data from secondary storage is critical to protecting users’ data privacy against unauthorized access. However, secure deletion is very costly for solid-state drives (SSDs), which, unlike hard disks, do not support in-place updates. When applied to SSDs, both erasure-based and cryptography-based secure deletion methods inevitably incur a large amount of valid-data migration and/or block erasures, which not only introduce extra latency and energy consumption but also harm SSD lifetime.

This paper proposes ErasuCrypto, a light-weight secure deletion framework with low block-erasure and data-migration overhead. ErasuCrypto integrates both erasure-based and encryption-based data deletion methods and flexibly selects the more cost-effective one to securely delete invalid data. We formulate a deletion-cost minimization problem and give a greedy heuristic as a starting point. We further show that the problem can be reduced to a maximum-edge biclique finding problem, which can be effectively solved with existing heuristics. Experiments on real-world benchmarks show that ErasuCrypto reduces the secure deletion cost of an erasure-based scheme by 71% and the cost of a cryptography-based scheme by 37%, while guaranteeing 100% security by deleting all the invalid data.
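The reduction mentioned in the abstract targets the maximum-edge biclique problem: in a bipartite graph, find vertex sets L and R such that every vertex in L is adjacent to every vertex in R, maximizing the edge count |L|·|R|. The paper's own heuristic is not reproduced here; the sketch below is a generic greedy heuristic for that problem, and the toy graph is an invented example, not data from the paper:

```python
def greedy_max_edge_biclique(adj):
    """Greedy heuristic for the maximum-edge biclique problem.

    adj maps each left vertex of a bipartite graph to the set of right
    vertices it is connected to. Returns (left, right) such that every
    vertex in `left` is adjacent to every vertex in `right`, greedily
    maximizing the biclique's edge count |left| * |right|.
    """
    best = (set(), set())
    for seed in adj:                       # try every left vertex as a seed
        left, right = {seed}, set(adj[seed])
        improved = True
        while improved:
            improved = False
            for v in adj:
                if v in left:
                    continue
                # Adding v shrinks the shared right side to this intersection.
                new_right = right & adj[v]
                if (len(left) + 1) * len(new_right) > len(left) * len(right):
                    left.add(v)
                    right = new_right
                    improved = True
        if len(left) * len(right) > len(best[0]) * len(best[1]):
            best = (left, right)
    return best

# Purely hypothetical toy graph (e.g., flash blocks on the left, shared
# invalid pages/keys on the right).
adj = {"a": {1, 2, 3}, "b": {1, 2}, "c": {2, 3}}
left, right = greedy_max_edge_biclique(adj)
```

On the toy graph the heuristic returns a biclique with four edges ({a, b} × {1, 2}); like all greedy heuristics for this NP-hard problem, it offers no optimality guarantee.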

Open access
Flying Eyes and Hidden Controllers: A Qualitative Study of People’s Privacy Perceptions of Civilian Drones in The US

Abstract

Drones are unmanned aircraft that are controlled remotely or operate autonomously. While the extant literature suggests that drones can in principle invade people’s privacy, little is known about how people actually think about drones. Drawing on a series of in-depth interviews conducted in the United States, we provide a novel and rich account of people’s privacy perceptions of drones for civilian uses, both in general and under specific usage scenarios. Our informants raised both physical and information privacy issues with government, organizational, and individual use of drones. Informants’ reasoning about the acceptability of drone use was based in part on whether the drone operates in a public or a private space; however, our informants differed significantly in their definitions of public and private spaces. While our informants’ privacy concerns, such as surveillance, data collection, and sharing, have also been raised for other tracking technologies such as camera phones and closed-circuit television (CCTV), our interviews highlight two heightened issues with drones: (1) powerful yet inconspicuous data collection, and (2) hidden and inaccessible drone controllers. These two aspects of drones render some of people’s existing privacy practices futile (e.g., noticing the recording and asking the controller to stop or delete it). Some informants demanded notifications of drones near them and expected drone controllers to ask for their explicit permission before recording. We discuss implications for future privacy-enhancing drone designs.

Open access
“What if?” Predicting Individual Users’ Smart Home Privacy Preferences and Their Changes

Abstract

Smart home devices challenge the long-held notion that the home is a private and protected place. With this in mind, many developers market their products with a focus on privacy in order to gain user trust, yet privacy tensions arise with the growing adoption of these devices and the risk of inappropriate data practices in the smart home (e.g., secondary use of collected data). It is therefore important for developers to consider individual user preferences and how they would change under varying circumstances, in order to identify actionable steps toward developing user trust and exercising privacy-preserving data practices. To help achieve this, we present the design and evaluation of machine learning models that predict (1) personalized allow/deny decisions for different information flows involving various attributes, purposes, and devices (AUC .868), (2) which circumstances may change original decisions (AUC .899), and (3) how much (in US dollars) one may be willing to pay or receive in exchange for smart home privacy (RMSE 12.459). We show how developers can use our models to derive actionable steps toward privacy-preserving data practices in the smart home.
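The allow/deny prediction task described above can be illustrated with a toy classifier. The paper's actual models, features, and data are not reproduced here; the feature encoding, the hypothetical flows, and the from-scratch logistic regression below are all assumptions made purely for illustration:

```python
import math

def one_hot(flow, vocab):
    """Encode a (device, attribute, purpose) tuple as a 0/1 feature vector."""
    vec = [0.0] * len(vocab)
    for token in flow:
        vec[vocab[token]] = 1.0
    return vec

def train_logreg(X, y, lr=0.5, epochs=500):
    """Plain stochastic-gradient-descent logistic regression."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))   # predicted P(allow)
            g = p - yi                        # gradient of the log loss
            b -= lr * g
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
    return w, b

def predict(w, b, x):
    """Return 1 (allow) or 0 (deny) for one encoded information flow."""
    return 1 if b + sum(wj * xj for wj, xj in zip(w, x)) >= 0 else 0

# Hypothetical training flows: in this toy data the purpose alone
# determines the user's decision.
flows = [
    ("camera", "video", "security"),
    ("camera", "video", "advertising"),
    ("doorbell", "audio", "security"),
    ("doorbell", "audio", "advertising"),
]
labels = [1, 0, 1, 0]  # allow security uses, deny advertising uses
vocab = {t: i for i, t in enumerate(sorted({t for f in flows for t in f}))}
X = [one_hot(f, vocab) for f in flows]
w, b = train_logreg(X, labels)
preds = [predict(w, b, x) for x in X]
```

On this separable toy data the learned weights concentrate on the purpose tokens, matching the intuition that the purpose of a flow drives allow/deny preferences; a real model would of course be trained and evaluated (e.g., via AUC) on held-out user data.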

Open access