The data protection design & privacy-enhancing design manifesto

and how data protection law & UX design must work together

Luiza Jarovsky
UX Collective


Photo of a crowd by Rob Curran on Unsplash

In today’s post, I will propose and explain a new discipline: Data Protection Design, which bridges the worlds of Data Protection Law and UX Design and makes Privacy-Enhancing Design the gold standard of UX design. As I explain below, this is a much-needed next step for data protection legislation and data protection practices around the world. I hope to involve other privacy and data protection specialists, UX designers, product managers, lawmakers and anyone interested in building better privacy and data protection practices. I invite you to read the proposal below and join the conversation.

WHY DATA PROTECTION DESIGN?

The ubiquity of deceptive patterns in data protection, and the inadequacy of privacy policies and written privacy notices as transparency tools in the online environment, make it clear that the current legal data protection framework is incomplete. By that, I mean that if we do not embrace UX design as an essential data protection component, a very meaningful part of the interaction between users and organizations will remain out of reach. As a result:

  • Deceptive design practices that impact personal data and fundamental values will continue flourishing;
  • Users will remain uninformed about the rights, data practices and privacy risks involved in their online activities (as they do not read privacy policies and should not be expected to read them);
  • Organizations will continue dealing with data protection, users’ data rights and the fundamental values connected to privacy in a tangential and reactive way, confined to their legal departments. Writing a legally compliant privacy policy will continue to be seen as the foremost goal of a data protection strategy (even though users do not read it and are not aware of their rights);
  • Privacy principles and goals will remain absent from organizations’ business models, value propositions and UX design strategies (but organizations will keep marketing that they care about privacy, because privacy sells).

DATA PROTECTION DESIGN — A NEW DISCIPLINE

I am proposing here that this must change. The worlds of Data Protection Law and UX Design must be bridged. Data Protection Design must be a discipline in itself, focused on:

  • Following privacy-enhancing design practices;
  • Translating legal data protection principles and rules into privacy-enhancing UX design practices;
  • Through an iterative process, developing principles, goals, rules and tools that should be followed by UX designers and product managers in order to implement privacy-enhancing design;
  • Developing transparency best practices to help organizations publicize their UX design practices (and be held accountable when not following privacy-enhancing design principles and practices);
  • Applying design thinking and design methods to the implementation of privacy-enhancing design;
  • Involving users and users’ perspectives in the implementation of privacy-enhancing design;
  • Establishing the role of Data Protection Designers and Data Protection Design Officers (DPDOs) within an organization, and defining the protocols and best practices they should follow.

PRIVACY-ENHANCING DESIGN — THE 7 PRINCIPLES

The 7 principles (or heuristics, as UX designers prefer) of Privacy-Enhancing Design are:

  1. Autonomy and Human Dignity are Central. User autonomy and human dignity are fundamental rights and must be respected throughout the UX design. The UX design must allow users to exercise their choices and preferences freely, autonomously and in an informed way. Users should not be pushed or forced to take a certain action. Users should be able to easily retract a certain choice or preference.
  2. Transparency. UX design practices should foster transparency and accessibility so that users are aware of ongoing data transactions. Every new data transaction (collection or processing) should be clearly signaled in an accessible way so that users can tell that data is being exchanged. Users should be made aware that their personal data is being collected or processed. Symbols, colors and a variety of other design features might be used to convey this information.
  3. No Previous Data Protection Knowledge. UX design should presuppose that users have no background data protection knowledge. Interfaces that involve data collection and processing should be clear and accessible, with simple and user-friendly indications of the scope and extent of the data transaction, including possible risks (even if these seem obvious to the designer).
  4. Acknowledgment of Cognitive Biases. Cognitive biases must be broadly recognized and acknowledged. Exploiting cognitive biases to collect more (or more sensitive) personal data, e.g. through deceptive patterns in data protection, must be avoided throughout the UX design process. Users should be seen as vulnerable and manipulable, and it is the responsibility of the organization to shield them from manipulation.
  5. Burden on Organizations. Organizations should be responsible for designing UX interfaces that do not exploit users’ cognitive biases. Organizations should be able to prove, at any time, that their UX design practices are privacy-enhancing (and not privacy-harming). If users are making errors, it is the organization’s responsibility to detect and correct the design practice that is causing them.
  6. Design Accountability. Organizations should be held accountable for their design practices. Organizations should publish their privacy-design practices (perhaps through a Privacy Design Policy, similar to a Privacy Policy but focused on UX design practices). It should be possible to legally question an organization about its UX design practices.
  7. Holistic implementation. The principles above must be implemented throughout the UX design and present in every interaction between users and organizations (i.e. not restricted to privacy settings). Privacy and Data Protection should be made an integral part of the interaction between organization and user.
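To make the principles above a bit more concrete, here is a minimal sketch of how principles 1, 2 and 5 could be reflected in code. Everything here — the `ConsentManager` class, its method names and the purpose labels — is my own illustrative assumption, not an existing library or a prescribed implementation:

```typescript
// Hypothetical consent model illustrating principles 1, 2 and 5.
// All names are illustrative assumptions, not an existing API.

type Purpose = "analytics" | "personalization" | "marketing";

interface ConsentRecord {
  granted: boolean;       // principle 1: defaults to false, never pre-checked
  timestamp: Date | null;
}

class ConsentManager {
  private records = new Map<Purpose, ConsentRecord>();
  private log: string[] = []; // principle 2: every transaction is surfaced

  constructor(purposes: Purpose[]) {
    // Privacy-enhancing default: everything starts declined.
    for (const p of purposes) {
      this.records.set(p, { granted: false, timestamp: null });
    }
  }

  grant(purpose: Purpose): void {
    this.records.set(purpose, { granted: true, timestamp: new Date() });
    this.log.push(`consent granted: ${purpose}`);
  }

  // Principle 1: retracting is as easy as granting: one call, no friction.
  retract(purpose: Purpose): void {
    this.records.set(purpose, { granted: false, timestamp: new Date() });
    this.log.push(`consent retracted: ${purpose}`);
  }

  isGranted(purpose: Purpose): boolean {
    return this.records.get(purpose)?.granted ?? false;
  }

  // Principle 5: the organization can show its consent history at any time.
  auditTrail(): string[] {
    return [...this.log];
  }
}

const cm = new ConsentManager(["analytics", "marketing"]);
console.log(cm.isGranted("analytics")); // false — nothing pre-selected
cm.grant("analytics");
console.log(cm.isGranted("analytics")); // true, after an explicit choice
cm.retract("analytics");
console.log(cm.isGranted("analytics")); // false again, retracted with one call
```

The point is not the specific API, but the defaults it encodes: no consent is assumed, every grant and retraction is logged so the organization can account for it, and withdrawal requires no more effort than acceptance.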

NEXT STEPS

At this point, it would be very important to set practical standards, rules and best practices for Privacy-Enhancing Design.

In a previous post, I ran an exercise to show what Privacy-Enhancing Design would look like in practice. There, I imagined a hypothetical Facebook “user post” interface built on strongly privacy-enhancing premises. My goal with that exercise was to show that change towards more privacy is possible and can be achieved through UX design. In that exercise, I showed that some of the premises embedded in Facebook’s current “user post” interface do not reflect a privacy-enhancing framework. In any case, how the version with improved premises would be implemented in practice is up to the “data protection designer” in charge (and aspects such as usability and the fluidity of the experience should also be considered).

I am a lawyer and not a designer, so insights from UX designers on the best tools and design practices to implement privacy-enhancing design will be extremely welcome and helpful. If you are a designer and have suggestions, I invite you to read the “Join the Conversation” section below and get in touch.

The practical implementation of privacy-enhancing design should be a task performed by a multidisciplinary group involving data protection lawyers, data protection designers and product managers. Knowledge of both the legal and the UX design aspects of Data Protection Design and Privacy-Enhancing Design (as described above) is required.

Additionally, the law needs to change. In my view, data protection law needs to treat UX design as part of the data protection ensemble and regulate data protection design. By “regulate” I mean mandating standards, best practices and accountability measures. Regarding compliance and accountability, as I proposed above, “Privacy Design Policies” (or analogous documents) should be present on every website or app, the same way Privacy Policies currently are; data protection designers should be present in every company as well, the same way data protection lawyers currently are.
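One way to picture a “Privacy Design Policy” is as a structured, machine-readable declaration published alongside the ordinary privacy policy, so that regulators and users can verify an organization’s design commitments. The sketch below is purely hypothetical: the schema, field names, organization and officer are all my own invention, not an existing standard:

```typescript
// Hypothetical schema for a machine-readable "Privacy Design Policy".
// The structure and all names are illustrative assumptions.

interface PrivacyDesignPolicy {
  organization: string;
  dataProtectionDesignOfficer: string; // the proposed DPDO role
  practices: {
    noPreCheckedConsent: boolean;        // principle 1: autonomy
    dataTransactionIndicators: boolean;  // principle 2: transparency
    plainLanguageNotices: boolean;       // principle 3: no prior knowledge
    darkPatternAudit: {                  // principles 4-5: bias acknowledgment
      performed: boolean;
      lastAuditDate: string | null;
    };
  };
  publicationUrl: string; // where users can verify these claims (principle 6)
}

// A fictional example instance for "Example Corp".
const examplePolicy: PrivacyDesignPolicy = {
  organization: "Example Corp",
  dataProtectionDesignOfficer: "Jane Doe",
  practices: {
    noPreCheckedConsent: true,
    dataTransactionIndicators: true,
    plainLanguageNotices: true,
    darkPatternAudit: { performed: true, lastAuditDate: "2022-06-01" },
  },
  publicationUrl: "https://example.com/privacy-design-policy",
};

// Publish as JSON alongside the Privacy Policy.
console.log(JSON.stringify(examplePolicy, null, 2));
```

A declaration like this would give accountability a concrete hook: each claimed practice is a specific, auditable statement rather than a vague marketing promise.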

As you might have noticed (especially if you have a legal background), I am purposefully refraining from using terms such as “data subject,” “controller” and “data processor,” and using “user” and “organization” instead. This is to facilitate communication between legal specialists and UX designers.

I will continue writing about Data Protection Design and Privacy-Enhancing Design in the next posts, so stay tuned.

JOIN THE CONVERSATION

Building a new discipline and revolutionizing data protection law is not a mission for a single person. I invite everyone interested to join the conversation and this “manifesto.” The key message here is that UX designers need to be on the data protection boat.

You can start by reading my previous posts about the topic here at UX Collective. These posts will give you the foundation on the most relevant issues and challenges that motivate me to propose Data Protection Design as a discipline and Privacy-Enhancing Design as a UX design framework.

If you have a little bit more time available, you can read my academic articles on Deceptive Design in Data Protection and on Transparency-by-Design, which are part of my ongoing Ph.D. at Tel Aviv University (supervised by Prof. Michael Birnhack and Prof. Eran Toch, to whom I am extremely grateful). They are publicly available on SSRN and there you can have the full theoretical background of the topics I am discussing here.

There is a lot of new research coming out on the topic of deceptive design, especially from the Human-Computer Interaction perspective, but also from legal scholars. You can find it online in public repositories and on SSRN.

The European Union recently published the “Behavioural study on unfair commercial practices in the digital environment. Dark patterns and manipulative personalisation: final report.” There they discuss deceptive design extensively (on pages 32–33 they quote and use my proposed taxonomy for deceptive design in personal data collection), signaling that not only data protection researchers but also lawmakers are realizing the need to tackle unfair design, which is exciting. I am optimistic, and I think change is coming. I invite you to be part of the change.


See you next week. All the best, Luiza Jarovsky


CEO of Implement Privacy, LinkedIn Top Voice, Ph.D. Researcher, Author of Luiza's Newsletter, Host of Luiza's Podcast, Speaker, Latina, Polyglot & Mother of 3.