Empathy and compassion for a better user experience?

A new approach to improving user experiences.

Ingo Waclawczyk
UX Collective

--

A statue of Buddha meditating; meditation is a practice for cultivating compassion
Photo by Mattia Faloretti

The worldwide use of the Internet rose to new record numbers in the past year. According to a recent study by Contentsquare (November 2021), online usage reached a new high of 4.8 billion user sessions per month worldwide. But is the use of digital applications also satisfactory for the users? For the important area of e-commerce, at least, the answer is clear: 85% of people are dissatisfied with their online shopping experience.

Online usage is therefore at a high level worldwide and at the same time a clear majority of users are dissatisfied with their experience. This observation leads me to the following hypotheses:

  • Could it be that the finding described above points to a fundamental problem in the communication between humans and (digital) machines?
  • Could it be that this great dissatisfaction among users is also due to the fact that providers primarily understand and develop digital applications as technical products, and not as communicative products?
  • And: Could it be that a new perspective on digitalization could help improve communication between humans and machines and make users happier with their online experience?

A term that comes up again and again when it comes to making something more “human” is “empathy”. In fact, there are some interesting articles on Medium that address the topic of empathy, such as “Design has an empathy problem”, “Your empathy map lacks empathy”, or “What is empathy mapping?”.

Empathy is great, but…

As a matter of fact, empathy is generally wonderful and can be a source of joy and kindness towards individuals. But there are also characteristics of empathy that are not very helpful when it comes to successful communication. The American psychologist Paul Bloom has compiled the current state of research in his book “Against Empathy” and given numerous examples of why empathy can be problematic:

  • The person in need of help sees the reflection of their suffering, but no lasting solution.
  • Empathy can lead to emotional stress for the helper (which was clearly observed in the Covid pandemic).
  • Empathy is biased, prejudiced, partial, and can lead to morally questionable decisions.
  • People make decisions based on their own view of things and are influenced by external impulses that are not constant.
  • The decisions are therefore always different. You think you’re acting rationally, but you’re actually acting irrationally.

Paul Bloom’s conclusion is that the negative qualities of empathy outweigh the positive ones. When I first heard this fundamental criticism of empathy (presented by Rasmus Hougaard, Managing Director of Potential Project, at the Peter Drucker Forum 2018 in Vienna), I was astonished and asked myself what the alternative could be. The experts’ answer: compassion.

The alternative to empathy: compassion

Unlike empathy, compassion is about, among other things:

  • A more distant feeling of love, kindness, and concern for others.
  • The person in need of help experiences the opposite of their suffering (e.g. calm instead of panic, or a friendly word when they are sad).
  • Unlike empathy, compassion does not mean sharing in another’s suffering.
  • Compassion is characterized by a feeling of warmth, interest, and genuine concern for others, and a strong motivation to improve the well-being of others.
  • Compassion-based help is good for oneself and good for others.
  • Compassion is feeling for others, not feeling with others.

The interesting thing is that research results from leading neuroscientists like Tania Singer show that empathy and compassion take place in different regions of the brain and are clearly distinguishable from each other. This differentiation is very important because the two terms are still often used as synonyms (almost like “UX” and “UI” are used by many as the same thing, even though they are something completely different; but that is another tiresome topic).

The scientists’ findings are:

  • Empathy occurs in brain regions where pain is also felt.
  • Empathy is often misunderstood as compassion and can lead to burnout.
  • Compassion takes place in brain regions where belonging and love are also felt.
  • Compassion can increase affection for others and can be trained — especially through the practice of meditation.

Even if research on this topic is only just beginning and far from complete, the researchers come to this statement: “Taken together, these results underline the important distinction between empathy and compassion, both on psychological and neurological level.” More research on the topic is under way. The researchers hope to find out more about the various aspects of compassion, because this knowledge could “help to assure an effective education fostering subjective wellbeing, adaptive emotion-regulation, meaningful relationships and human prosociality.”

The experiment: can empathy and/or compassion improve user experience?

Empathy and compassion seem to have a major impact on the interaction between people. Do they also influence the interaction between people and machines, i.e. the user experience? We approached this question with an interactive experiment as part of the 20th UX Meetup Metropolis Ruhr of the professional association of German usability and user experience experts (German UPA) in January 2020.

The basic question for the experiment was: Could it be that an automated dialogue expressing empathy or compassion is more usable than a purely functional dialogue?

As a test object, we created a chatbot that users could use to reserve and book a conference room. We developed three variants for this (their different tones are sketched after the list below):

  • A “Technical” one (using functional language)
  • An “Empath” (a kind of “assistant”)
  • A “Compassionate” one (with very relaxed language)
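
To make the difference between the variants more concrete, here is a minimal sketch of how the same booking confirmation could be phrased in the three tones. The variant names, the wording, and the confirm_booking helper are illustrative assumptions; they are not the chatbot actually used in the experiment.

```python
# Minimal sketch (not the experiment's implementation): the same booking
# confirmation phrased in the three tones described above. All wording
# and names here are illustrative assumptions.

REPLY_TEMPLATES = {
    "technical": "Room booked: {date}, {guests} guests, purpose: {purpose}.",
    "empathetic": (
        "I've reserved a room for {guests} guests on {date}. "
        "It should work well for your {purpose}. Is there anything else I can check?"
    ),
    "compassionate": (
        "Wonderful, that's all taken care of! Your room for {guests} guests is "
        "ready on {date}. I hope your {purpose} goes really well, and I'm happy "
        "to help with anything else you need."
    ),
}

def confirm_booking(variant: str, date: str, guests: int, purpose: str) -> str:
    """Return the booking confirmation in the tone of the chosen chatbot variant."""
    return REPLY_TEMPLATES[variant].format(date=date, guests=guests, purpose=purpose)

# Example: the task from the experiment (a specific date, number of guests, purpose).
print(confirm_booking("compassionate", "14 May", 8, "project kick-off"))
```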

To carry out the experiment, the approximately 50 participants in the event were divided into three test groups. Each group tested the three chatbots in a different order to get comparable results.

A single, fixed scenario was specified in the experiment so that the participants could complete the same task with each of the three chatbots. A persona, the specific context of use, and the task were presented to the participants. The task was: book a conference room for a specific date, for a specific number of guests, and for a specific purpose.

In the scenario, the persona uses the chatbot for the first time. A time frame of 15 minutes was provided for testing the three chatbots. The use of the chatbots was measured quantitatively and followed up with a qualitative survey. The results were evaluated in real time and discussed with the participants as part of the experiment.

The results show: yes, they can…

An interesting result of the survey: 91% of the participants noticed differences when using the three chatbots. When asked what differences they noticed, their responses included:

  • The technical bot was functional and to-the-point, but was also perceived as unsympathetic and rude.
  • The “assistant” gave me the best information.
  • The compassionate bot, unlike the others, did the best job of guiding me through the process, but was also found to be overly friendly. In addition, its texts take a long time to read.

When asked about their satisfaction with use, the participants answered the following:

  • 40% of the participants found the technical chatbot the easiest to use
  • 36% of the participants found the empathetic chatbot the easiest to use
  • 24% of participants found the compassionate chatbot the easiest to use

In total, almost two-thirds of the users found the two new chatbots the easiest to use. The results of the experiment can be summarized as follows:

The “technical” chatbot was rated by the participants as efficient, fast and to-the-point. On the other hand, it was also perceived as rude and unsympathetic.

The “empathetic” chatbot led the participants to their goal the fastest and gave them the best information.

The “compassionate” chatbot was perceived as the most polite and friendly and as providing the best guidance throughout the process. On the other hand, its texts were perceived as too long and exaggerated.

Summary

Empathy and compassion are clearly perceived as distinct by users and can actually lead to a better user experience (“Guided me well through the process, felt like holding my hand, gave good information”).

With regard to the initial question (“Could it be that an automated dialogue that expresses empathy or compassion is more usable than a purely functional dialogue?”), the results of the experiment show: empathy and compassion offer further forms of communication beyond the purely functional one, and they can be used as methods to achieve a better user experience, depending on the context and the specific task.

A concrete example where the findings from the experiment could be applied: the wording of complex online processes (e.g. forms), where the speed of use matters less than the completion of the entire task. In any case, it will be interesting to see whether empathy and compassion can establish themselves as methods of communication in digital applications and whether user satisfaction will move in a more positive direction.
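
As a purely hypothetical illustration of that idea (the step counts and wording below are assumptions, not material from the experiment), the same progress message of a multi-step form could be written in a functional register or a more compassionate one:

```python
# Hypothetical illustration: the same progress message of a multi-step online
# form, once in a functional register and once in a compassionate one.
# All wording here is assumed for the sake of the example.

PROGRESS_MESSAGES = {
    "functional": "Step {step} of {total}. Complete all required fields to continue.",
    "compassionate": (
        "You're on step {step} of {total}, nicely done so far. "
        "Take your time, there's no rush."
    ),
}

def progress_message(style: str, step: int, total: int) -> str:
    """Return the form's progress message in the chosen register."""
    return PROGRESS_MESSAGES[style].format(step=step, total=total)

print(progress_message("compassionate", 3, 5))
```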

References:

Paul Bloom: “Against Empathy”, Boston Review.

Tania Singer: “Compassion: Bridging Practice and Science”.
About: What is the difference between empathy and compassion? Is it possible to train compassion? Can it be measured? How useful is compassion training in schools, clinical settings, and end-of-life care? Can the brain be transformed through mental training?

Tania Singer and Olga Klimecki: “Empathy and Compassion”.
