Transparency by design: transforming the web

Luiza Jarovsky
UX Collective
Published in
7 min read · May 26, 2022


[Image: three user experience mockups of mobile interfaces, one blue, one purple and one green. Photo by Hal Gatewood on Unsplash.]

Today’s topic is Transparency by Design, the approach I proposed in my latest research article to address unfair design practices online, such as dark patterns in data protection. After discussing the cognitive biases that dark patterns exploit, in this post I want to highlight the role of UX designers in helping users learn what happens with their data and in empowering them to navigate the web with more privacy awareness.

Transparency by Design proposes that compliance with transparency rules (in the data protection context, such as the ones established in the GDPR) should happen at all levels of UX design. In the full article, I showed why failing to support users with accessible information, privacy-supportive design and meaningful choices creates an unfair online environment, exacerbating informational vulnerabilities. In this post, I would like to focus on how privacy and data protection concerns can be thought through and addressed via UX design.

A starting point for this discussion, and the elephant in the room, is privacy policies. Currently, they are the main channel organizations use to comply with their transparency obligations. However, they have always bothered me because they are “fake”: they pretend to be tools to empower users, when they are actually documents written by lawyers, for lawyers, aimed mostly at avoiding liability in case of fines and lawsuits. Nevertheless, they try to use “friendly” language and make clear that it is the user’s responsibility to read them and to come back and check for updates. This makes no sense, and it has always made me angry. An internet user might navigate through dozens of websites or apps in a single day. Are they really expected to read all these privacy policies, and even check for updates? Beyond this nonsensical expectation, even users who wanted to embark on this mission would face so many cognitive biases along the way that they would probably not succeed (I wrote an academic article about the shortcomings of privacy policies; you can read it here). There has to be another solution, and this is where UX design enters the game.

UX design experts Don Norman and Jakob Nielsen said that:

“The first requirement for an exemplary user experience is to meet the exact needs of the customer, without fuss or bother. Next comes simplicity and elegance that produce products that are a joy to own, a joy to use. True user experience goes far beyond giving customers what they say they want, or providing checklist features. In order to achieve high-quality user experience in a company’s offerings there must be a seamless merging of the services of multiple disciplines, including engineering, marketing, graphical and industrial design, and interface design.”

Transparency by Design, therefore, proposes that the UX design of a product or service should be a vehicle to transmit data protection information and help mitigate informational vulnerabilities. When planning and executing their product UX, organizations should take into consideration the data practices that occur in that context and aim at generating awareness of these practices. Let me give an example:

[Screenshot of the Facebook interface that appears when a user is about to post on their profile: the name and picture of Luiza Jarovsky on top, then the text “What’s on your mind,” followed by options to edit or add other features to the post (such as image, location, etc.).]

Above is a screenshot of the interface shown when a user wants to post something on Facebook. The UX design presents the visibility menu right below the profile name, where the user can choose between posting publicly, to “friends,” “friends except …,” “specific friends” or “only me.” Showing this menu every time the user posts new content is a positive data protection feature, and a better practice than concealing it (and expecting the user to go to the general settings page and choose a default configuration there).

However, it is still not ideal, as it makes several assumptions about users’ prior knowledge of data practices which might not hold, and which might generate discomfort and privacy harm. For example, the current design assumes that users understand that:

  1. clicking on the audience menu will give them audience options for the post, and they might benefit from tailoring the visibility of certain types of content (it is not mandatory to confirm a visibility choice before proceeding to post, so users might not understand that there is the option to change it);
  2. posting on Facebook means that, as a digital and fungible asset, the content now is out of the user’s control and can potentially be re-uploaded indefinitely and on different platforms, for different uses;
  3. posting publicly means that strangers might see that content and interact with it;
  4. posting publicly means that if someone comments on the post, some of the commenter’s Facebook “friends” will be notified on their feed about this interaction and might want to interact as well, generating a cascade of potentially unwanted interactions;
  5. posting to “friends” means that anyone from the user’s Facebook’s “friends” list might see the content, even long-time ex-coworkers, distant family members or estranged acquaintances;
  6. posting to “friends” means that if someone comments on the post, some of the commenter’s “friends” that are also “friends” with the original author will be notified on their feed about this interaction and might want to interact as well, generating a cascade of interactions;
  7. a Facebook profile is searchable on search engines such as Google (unless selected otherwise in the privacy settings). If users post something publicly, this content can be indexed and found through search engines when someone looks up their name;
  8. the content posted on Facebook will be used to tailor advertising to the user;
  9. the content posted on Facebook will be used by third-party apps whose accounts were connected to Facebook by the user.

These assumptions are all embedded in the UX design choices shown in the image above, meaning that Facebook’s UX designers have assumed that users are aware of items 1–9 when they post content. If these assumptions held, the current UX design would be ideal, as it would match the skills and background knowledge of the users it caters to. However, studies have shown that users lack essential knowledge about data practices. Designers and developers are deeply specialized in how their organization’s products and services work, and they might forget that the regular user does not have a fraction of that background knowledge, and will therefore be uninformed and vulnerable when trying to navigate a given interface.

Transparency by Design proposes that no data protection background knowledge should be expected from users. UX designers should keep data protection illiteracy in mind and build interfaces that can cater for audiences that are vulnerable to privacy harm. Going back to Facebook’s interface above, examples of changes to the interface that would be aligned with Transparency by Design are:

  1. ask users to manually select the audience of the post before every post;
  2. after the user has selected the desired audience, show on average how many people will see the post and ask whether the user wants to change the desired audience;
  3. warn the user before posting that other people could download or screenshot the content and repost it on Facebook or other platforms, therefore it might be impossible to retrieve control over its availability;
  4. warn users that even posting to friends only, distant or estranged friends might interact with the post and generate unwanted attention;
  5. ask the user if he or she wants to allow others to comment or share the post;
  6. add a button before posting where users could receive more information about audiences, how their data is being used, and how information is shared with connected apps and to personalize advertising;
  7. actively ask users if they want their profile to be publicly indexed by search engines;
  8. inform users before posting that the content posted will be used by advertisers and third party apps;
  9. provide an easily accessible 24/7 channel (e.g., an intelligent chatbot) to offer support to users on data protection issues; if the chatbot cannot provide a satisfactory answer, offer a feedback form that is answered in a timely manner.
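To make the idea concrete for practitioners, several of the suggestions above can be expressed as simple interface logic: the posting flow blocks until the user makes explicit choices, and contextual notices are generated at the point of data collection. The sketch below is a hypothetical illustration under my suggested guidelines, not Facebook’s actual implementation; all names (`DraftPost`, `transparencyPrompts`, etc.) are invented for this example.

```typescript
// Hypothetical sketch of a Transparency-by-Design pre-post check.
// The types and prompt texts are illustrative, not any platform's real API.

type Audience = "public" | "friends" | "only_me" | null;

interface DraftPost {
  audience: Audience;        // null until the user explicitly picks one (suggestion 1)
  estimatedReach?: number;   // platform-supplied reach estimate (suggestion 2)
  allowComments?: boolean;   // undefined until the user explicitly decides (suggestion 5)
}

interface TransparencyCheck {
  canPost: boolean;
  prompts: string[];
}

function transparencyPrompts(draft: DraftPost): TransparencyCheck {
  const prompts: string[] = [];

  // Suggestion 1: no default audience; block posting until one is chosen.
  if (draft.audience === null) {
    return {
      canPost: false,
      prompts: ["Please choose who can see this post before publishing."],
    };
  }

  // Suggestion 2: surface the expected reach of the chosen audience.
  if (draft.estimatedReach !== undefined) {
    prompts.push(
      `On average, about ${draft.estimatedReach} people may see this post. ` +
        "Do you want to change the audience?"
    );
  }

  // Suggestion 3: warn about loss of control over the content.
  prompts.push(
    "Others may screenshot or download this content and repost it elsewhere; " +
      "you may not be able to remove all copies."
  );

  // Suggestion 4: even "friends" can include distant or estranged contacts.
  if (draft.audience === "friends") {
    prompts.push(
      "Anyone on your friends list, including distant acquaintances, " +
        "can see and interact with this post."
    );
  }

  // Suggestion 5: make comment/share permissions an explicit choice.
  if (draft.allowComments === undefined) {
    prompts.push("Do you want to allow others to comment on or share this post?");
  }

  // Suggestion 8: contextual notice about advertising and third-party apps.
  prompts.push(
    "This content will be used to personalize ads and may be shared " +
      "with apps you have connected."
  );

  return { canPost: true, prompts };
}
```

The design choice worth noting is that every notice is generated in context, at the moment of posting, rather than buried in a settings page or a privacy policy, which is exactly the "explanation close to the collection point" guideline discussed below.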

These are a few suggestions from a researcher who is not privy to Facebook’s proprietary UX design or decision-making processes. My argument is that data protection and safeguarding users’ privacy should be two more disciplines UX designers consider when elaborating their product or service’s interface. The main guideline to be followed is a simple one: no prior data protection knowledge should be expected from users; all data collection points should be transparent and clear; and explanations should be contextual and as close to the collection point as possible.

In addition to the general guideline above, market players and industry organizations could develop sector-specific guidelines on how to elaborate these contextual, UX-embedded privacy notices (or “visceral notices,” as Ryan Calo has named them), with the main focus of reducing users’ informational vulnerabilities and increasing their data protection literacy. More research is needed to measure and propose adequate forms of UX-embedded privacy notices for different online scenarios and audiences.

There is a lot to unpack (my full article has 58 pages, you can read it for free here) and I hope to talk more about it in future posts, especially regarding UX designers’ role in promoting privacy. I personally believe that the web can be a more empowering place, where people can have their vulnerabilities protected, their privacy respected and their autonomy valued. I think this transformation is possible and I invite you to join the conversation.

Did you enjoy reading this post? I would love to hear your feedback about the content, as well as additional topics that you would like to see covered. This is an ongoing project and new ideas are always welcome.

See you next week. All the best, Luiza Jarovsky


CEO of Implement Privacy, LinkedIn Top Voice, Ph.D. Researcher, Author of Luiza's Newsletter, Host of Luiza's Podcast, Speaker, Latina, Polyglot & Mother of 3.