eScholarship
Open Access Publications from the University of California

UC Irvine Electronic Theses and Dissertations

A User-Tailored Approach to Privacy Decision Support

Creative Commons BY-NC 4.0 license
Abstract

As an increasingly important part of our social, professional and financial lives happens online, the frequency with which we have to deal with privacy problems is ever on the rise. In this dissertation I answer the question: How can we help users to balance the benefits and risks of information disclosure in a user-friendly manner, so that they can make good privacy decisions?

After briefly motivating this question in Chapter 1, I first discuss problems with existing answers to this question in Chapters 2 and 3. In Chapter 2, I explain how providing transparency and control does not sufficiently help users make better privacy decisions. Specifically, I demonstrate that people’s privacy decisions fall prey to all sorts of decision biases, and that most privacy decisions are too complex for people to fathom. As a result, many people refrain from exploiting the provided transparency and control altogether.

In Chapter 3, I explain how “privacy nudging” is also not sufficient in its presently studied form. Specifically, I demonstrate that although nudges relieve some of the burden of privacy decision making, they tend to overlook the inherent diversity of users’ privacy preferences and the context-dependency of their decisions.

The main argument of this dissertation is that because of these shortcomings, privacy scholars need to move beyond the “one-size-fits-all” approach to privacy embodied in both transparency-and-control and privacy nudges. I argue that because of the high variability and context-dependency of people’s privacy decisions, nudges need to be tailored to the user and her context.

In several studies, I contextualize users’ privacy decisions by showing how disclosure depends on the person’s privacy profile, the type of information, and the recipient of the information (Chapter 4). Then, I present the idea of a “privacy adaptation procedure” and demonstrate its merit in Chapter 5. Finally, I test a complete implementation of the privacy adaptation procedure in Chapter 6. The results of this final study give cause for reserved optimism regarding the feasibility of user-tailored privacy decision support.
