Project A6

Understandable Privacy Specification

Principal Investigators

Antonio Krüger

krueger@dfki.de

Project Summary

It is well known that, in current digital user habitats, especially in online social networks, users experience severe difficulties in understanding and specifying their privacy settings. Two key reasons for this are that: (1) the possible settings are typically technically motivated and therefore cannot accurately reflect users’ actual priorities; (2) it is typically very complex for users to understand the consequences of their settings, and thus to configure the settings to suit their needs. As a result, users often stick to the default settings even though these do not necessarily match their personal preferences. Within the project, we developed a combination of techniques to address these difficulties for several exemplary domains. Difficulty (1) was addressed by advancing user-oriented models and by a novel combination of two different kinds of such models. We modelled the user’s personality and privacy attitudes, capturing the user’s needs and preferences in a generic manner that can be adapted to all of the domains considered, enabling the system to explicitly refer to and reason about the user’s individual view. To address (2), we developed a new user interface that visualizes the privacy rules and allows users to (a) easily get an overview of the current privacy state, (b) detect possible privacy leaks or misconfigurations, and (c) review, adapt, and fix their privacy settings. These components approach privacy from different perspectives (user vs. system) and mutually benefit from each other through a user feedback cycle, including the possibility for users to conveniently give in-situ feedback while witnessing unwanted consequences of information sharing, based on wearable computing technology that makes them aware of the final audience at the time of posting.
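
As a minimal illustration of how such user-facing privacy rules could be represented and summarized for review in a visualization UI, the sketch below uses a small Python data model. All names (PrivacyRule, Audience, the data categories) are hypothetical assumptions chosen for illustration and do not describe the project's actual implementation.

```python
from dataclasses import dataclass
from enum import Enum


class Audience(Enum):
    """Hypothetical audience levels a sharing rule can target."""
    ONLY_ME = "only me"
    FRIENDS = "friends"
    FRIENDS_OF_FRIENDS = "friends of friends"
    PUBLIC = "public"


@dataclass
class PrivacyRule:
    """One user-facing sharing rule, e.g. 'share my location only with friends'."""
    data_category: str   # e.g. "location", "post", "app permission", "store visit"
    audience: Audience   # who may see items of this category
    active: bool = True  # rules can be temporarily suspended


def privacy_overview(rules: list[PrivacyRule]) -> dict[str, str]:
    """Summarize the current privacy state per data category -- the kind of
    overview a rule-visualization UI could present to the user."""
    return {r.data_category: r.audience.value for r in rules if r.active}
```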
The overall goal was to put users in a position to effortlessly understand their exposure and to help them assess the privacy consequences of their actions in quantitative and qualitative terms. This was ensured through a user-centered design process, including extensive lab user studies. While our basic methodology and ideas apply in principle to arbitrary privacy-relevant information, different types of privacy-relevant information naturally differ widely in their specifics, i.e., in the required model attributes, meaningful privacy visualizations, suitable forms of in-situ feedback, etc. Addressing all of these aspects is far beyond the scope of any single research project and would have been more distracting than useful. We therefore focused on four domains where user privacy plays an important role, namely location sharing, social media posts, mobile app permission settings, and sensitive data captured in an intelligent retail store, which tracks actions such as customers’ movements through the store as well as the products they view or buy. These are highly relevant use cases in their own right. Moreover, most of the concepts, and many of the technologies, we developed are general-purpose in principle, so it can be expected that our work will yield useful insights for other types of privacy-relevant information as well. The enforcement of privacy policies, i.e., allowing or disallowing information sharing according to the user’s rules, was implemented prototypically on the social network platform developed within this project.
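
Continuing the sketch above, the enforcement step (deciding whether a piece of information may be shared with a particular viewer) could then reduce to checking the viewer's relation against the audience of the matching rule. The ordering of audience levels and the deny-by-default behaviour for missing or inactive rules are assumptions made for illustration, not a description of the prototype.

```python
# Audience levels ordered from most restrictive to most permissive (an assumption).
AUDIENCE_ORDER = [Audience.ONLY_ME, Audience.FRIENDS,
                  Audience.FRIENDS_OF_FRIENDS, Audience.PUBLIC]


def is_sharing_allowed(viewer_relation: Audience,
                       data_category: str,
                       rules: list[PrivacyRule]) -> bool:
    """Allow sharing only if an active rule for the data category admits the
    viewer's relation to the data owner; deny by default otherwise."""
    for rule in rules:
        if rule.active and rule.data_category == data_category:
            return (AUDIENCE_ORDER.index(viewer_relation)
                    <= AUDIENCE_ORDER.index(rule.audience))
    return False  # no applicable rule: do not share (privacy by default)


# Example: location is shared with friends, so a friend may see it,
# but a friend of a friend may not.
rules = [PrivacyRule("location", Audience.FRIENDS)]
assert is_sharing_allowed(Audience.FRIENDS, "location", rules)
assert not is_sharing_allowed(Audience.FRIENDS_OF_FRIENDS, "location", rules)
```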

Role Within the Collaborative Research Center
