5 ways to add empathy to your app

John Airaksinen
Published in Chatbots Life · 8 min read · Mar 26, 2019


Sometimes your user just needs a hug. Photo: Dizão Gonçalves

For the last four years my team and I have been working on chatbots for emotional wellbeing (Enjo being the latest), and along the way we have interviewed more than 150 users. When we ask what they like most about the bot, a few words come up more than any others:

“It’s warm, supportive and empathetic.”

Empathy is of course critical in a conversation about very personal thoughts and feelings — but I do believe most digital services could use more of it. In this post I’ve distilled what we’ve learned into five principles that can help you add a sprinkle of empathy to your app.

1. Make the user comfortable with saying no

Imagine you’re getting a text from a friend who asks you to go out on a Friday night, when you’re completely drained and just want to spend the evening curled up on the couch. Do you feel totally comfortable saying you’re not up for it — or does it come with a sense of guilt? That probably depends on how your friend has responded to you in similar situations in the past. There’s a world of difference between being met with “BOORING!” and “I get it, rest up!”.

In the same way, an app that suggests that the user do something has a lot to gain by making the user comfortable with saying no. Why? Because you remove negative emotions from the experience: the slight feeling of guilt when declining, and the sense of pressure the next time a suggestion pops up. It’s also an opportunity to add positive ones.


When interviewing Enjo users, I’ve been surprised to hear how much this seems to matter. More times than I can count, users have told me that they really like that it’s always okay to say no when Enjo suggests something (express gratitude to a friend, plan a relaxing activity, etc.), and that the bot never guilts them. Creating this experience is simply a matter of phrasing the replies: “No worries”, “No problem at all, John”, “Got it, of course you shouldn’t do it if you don’t feel like it”.
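To make that concrete, here is a minimal sketch of what decline handling could look like. The names are invented for illustration and this is not Enjo's actual code; the point is simply that every "no" leads to a warm, guilt-free acknowledgement.

```typescript
// Hypothetical sketch, not Enjo's actual code: all names here are invented.
// The point is that every decline path leads to a warm, guilt-free reply.
const declineReplies = [
  "No worries!",
  "No problem at all, {name}.",
  "Got it, of course you shouldn't do it if you don't feel like it.",
];

function onSuggestionDeclined(userName: string): string {
  // Pick a reply at random so the bot doesn't sound scripted,
  // and fill in the user's name where the template asks for it.
  const template = declineReplies[Math.floor(Math.random() * declineReplies.length)];
  return template.replace("{name}", userName);
}

// e.g. onSuggestionDeclined("John") -> "No problem at all, John."
```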

Think of it like this: What would you want a friend to say when you tell them no?

Microcopy expert Kinneret Yifrah has a great take on this, stressing that you should convey that you have the user’s best interest in mind when they say no. She provides a bunch of striking counterexamples of how services instead induce guilt in their users when they reject an offer. Here’s one:

2. Don’t be pushy

Making the user comfortable with saying no is just one part of giving suggestions. Equally important is how you suggest things — and really reflecting on whether you need to push the user at all.

For example: It’s a common misconception that the sole purpose of sending a push notification is to get the user to open the app. That can of course be the case — but it doesn’t have to be.

In Enjo we’ve deliberately designed some notifications to just add value in the moment instead of trying to get the user to open the app. Here’s a quote from a recent review of Enjo in the US App Store:

”For instance one morning I had a notification telling me one thing my kids enjoy about having me as a parent is that I make it a point to do something fun everyday. And it’s true! I started off my morning with a smile and a little more confidence.”

The key here is that it’s not a generic feel good message sent to all users, but a result of what this user had expressed to Enjo in a previous conversation.
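Loosely sketched (hypothetical types and names, not Enjo's implementation), a notification like that is assembled from reflections the user has already shared, so it delivers value even if the app never gets opened:

```typescript
// Hypothetical sketch: a notification built from something the user already
// told the bot, so it adds value even if the app is never opened.
interface Reflection {
  topic: string;          // e.g. "parenting"
  userStatement: string;  // the user's own words from an earlier conversation
}

function buildMorningNotification(reflections: Reflection[]): string | null {
  // Nothing personal to echo back? Better to send nothing than something generic.
  if (reflections.length === 0) return null;
  const pick = reflections[Math.floor(Math.random() * reflections.length)];
  return `Something you told me recently: ${pick.userStatement}`;
}
```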

When you do want to get the user into the app, a gentle manner doesn’t hurt:

3. Anticipate and validate negative emotions

Empathy is always a great thing to experience, but most of all when you’re in a negative state of mind. After having a rough day at work, it goes a long way if your partner just listens to your concerns and conveys that they’re able to take your perspective.

In Enjo, when the user expresses a negative thought like “I’m feeling down because I got criticized at work”, the bot will probably not be able to solve the user’s problem. But we’ve learned that there’s great value in simply making the user feel heard. A reply like “That can be really tough, especially if you’ve done your best!” is so much better than a generic “Ouch, that sucks!”.
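One way to picture this, as a hypothetical sketch rather than Enjo's actual logic, is a mapping from whatever concern the bot detects to a validation that reflects that specific situation, with a neutral acknowledgement as the fallback:

```typescript
// Hypothetical sketch: match the validating reply to the detected concern
// instead of always falling back to a generic "Ouch, that sucks!".
const validationsByTopic: Record<string, string> = {
  criticized_at_work: "That can be really tough, especially if you've done your best!",
  conflict_with_friend: "Falling out with someone you care about really hurts.",
  feeling_exhausted: "It sounds like you've been running on empty for a while.",
};

function validate(detectedTopic: string | null): string {
  if (detectedTopic && validationsByTopic[detectedTopic]) {
    return validationsByTopic[detectedTopic];
  }
  // Fallback: still acknowledge the feeling, just less specifically.
  return "That sounds hard. I'm here if you want to tell me more.";
}
```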

Dennis Mortensen, the CEO and founder of the AI personal assistant x.ai, gave another good example of anticipating and validating negative emotions:

”For example, if you have to reschedule a meeting once, that’s no big deal. But if you are on the third reschedule, Amy needs to signal that she realizes that this is not an ideal situation, just as a human assistant would. The biggest surprise is how well it worked. People mistake Amy and Andrew for human assistants all the time. They’re invited to join calls and meetings and occasionally even asked out on dates.”

(from Designing Bots, 1st Edition by Amir Shevat)
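The same idea can be sketched in a few lines. This is my illustration, not x.ai's code: the more often a meeting has been rescheduled, the more explicitly the assistant acknowledges the inconvenience.

```typescript
// Hypothetical sketch (not x.ai's implementation): escalate the acknowledgement
// as the number of reschedules grows, the way a human assistant would.
function rescheduleReply(rescheduleCount: number): string {
  if (rescheduleCount <= 1) {
    return "Sure, let's find a new time.";
  }
  if (rescheduleCount === 2) {
    return "Of course. Sorry about the back and forth, let's get this locked in.";
  }
  return "I realize this is the third reschedule and far from ideal. Thanks for bearing with us; I'll make sure the new time sticks.";
}
```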

4. Help the user save face

No one likes to feel like a failure, and an empathetic app should make sure you don’t. Enjo often asks the user to reflect on big, important, meaningful things (“How do you want to be as a person?”, “What do you appreciate the most in your partner?”, “What does this friend mean to you?”) — and we make an effort to allow the user to decline answering without feeling stupid. Again, much of it comes down to phrasing.

In some cases we make it super easy for the user to say “Ask me something else” instead of having to concede explicitly that they couldn’t find an answer to the question. And when the user does say they don’t know, Enjo will downplay it with things like “I know it’s not always easy to pinpoint specific qualities like that.” or “I guess it’s hard to remember if it was a while ago”.
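Sketched loosely (invented names, not Enjo's code), this boils down to always offering an "Ask me something else" quick reply, and answering an explicit "I don't know" with something that normalizes it:

```typescript
// Hypothetical sketch: offer an easy way out of a hard question, and
// downplay it when the user says they don't know.
const quickReplies = ["Ask me something else", "I don't know"];

const dontKnowReplies = [
  "I know it's not always easy to pinpoint specific qualities like that.",
  "I guess it's hard to remember if it was a while ago.",
];

function onQuickReply(choice: string): string {
  if (choice === "Ask me something else") {
    // Move on without making the user concede that they had no answer.
    return nextQuestion();
  }
  return dontKnowReplies[Math.floor(Math.random() * dontKnowReplies.length)];
}

// Placeholder for whatever question the bot would ask next.
function nextQuestion(): string {
  return "What's one small thing that made you smile this week?";
}
```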

Enjo uses streaks as a way to motivate the user to do a daily reflection, but only as positive reinforcement, never as punishment. Enjo won’t mention the fact that you have lost a streak. I really don’t see the value in shoving failure in the user’s face, but it’s clear not everyone agrees. Here’s how the gamified learning platform Kahoot! communicates a lost streak:
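Enjo's rule, by contrast, is tiny. As a hypothetical sketch (not the actual code): celebrate a streak when it grows, and say nothing at all when it breaks.

```typescript
// Hypothetical sketch: streaks as positive reinforcement only.
// A broken streak is never mentioned; the counter just quietly resets.
function streakMessage(previousStreak: number, currentStreak: number): string | null {
  if (currentStreak > previousStreak && currentStreak > 1) {
    return `That's ${currentStreak} days in a row, nice!`;
  }
  return null; // lost or unchanged streak: no message, no guilt
}
```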

5. Let the user express empathy

Last but not least — empathy is not exclusively directed from the service to the user; it can flow the other way as well. One part of the experience that’s well liked by our users is when Enjo, after a few minutes of conversation, asks if the user wants to continue talking or say goodbye for now. When I ask users in interviews why this matters, why they can’t just close the app when they want to stop talking, I get responses like “that doesn’t feel good” and “that would be rude!”.

We took this lesson to heart when we added a button to end the conversation at any time:
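The mechanics behind the button are simple. As a hypothetical sketch with invented names, it just triggers a warm sign-off that the user can reach at any point in the conversation:

```typescript
// Hypothetical sketch: an always-available "say goodbye" action, so the user
// never has to simply abandon the conversation to stop talking.
function endConversation(userName: string): string {
  return `Thanks for talking with me today, ${userName}. Take care, and come back whenever you like.`;
}
```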

Does it sound strange that users would care about being polite to a bot? Take a look at this video and pay attention to what emotions it evokes in you.

Did you feel slightly uncomfortable watching those poor robots getting abused?

Even though our intellects are fully convinced these are not living beings, we can’t really stop the emotional response. On some weird level, this feels wrong. I believe the same mechanism is in play when users want to say goodbye to Enjo. As soon as we give technology some qualities of a living being, it seems to evoke responses we generally reserve for conscious beings.

Empathy is not just about the user experience

I hope these five principles can help you make your app more empathetic, and improve the experience for your users. However, I believe the greatest promise of empathetic technology is that it can actually make us humans more empathetic to each other. A friend of mine told me once that she had noticed a peculiar pattern after having talked to our chatbot for a while: she started to adopt language from the bot when texting with her friends, and expressed herself in a more empathetic way.

Yale professor Nicholas A. Christakis did a clever experiment that illustrates how empathetic technology can affect how humans interact with each other. In the experiment, groups of people worked together with a humanoid robot to construct railroads in a virtual world. In some groups, the robot was programmed to make mistakes, and then acknowledge them:

“Sorry, guys, I made the mistake this round. I know it may be hard to believe, but robots make mistakes too.”

By comparing these groups with groups in which the robot simply made bland statements, the researchers could isolate the effect of the clumsy, apologizing robot on the humans: they performed and cooperated better, became more relaxed and conversational, supported each other more, and laughed more together.

So give it a shot, and try to make your service more empathetic — it can have amazing ripple effects. And why not view the design process as an exercise in increasing your own capacity for empathy? As game designer Mitu Khandaker put it:

“Here’s a radical suggestion: I propose that the exercise itself of designing conversational AI characters is one that engenders empathy with them — and with each other. This latter part is a potentially thorny issue; while there are no ‘quick fixes’ for societal empathy — and certainly not through purely technological interventions — we can see how this can be part of the answer. Our tools shape us.”
