Conversational AI Bots with Emotions

Rajkumar Bhojan
Published in Chatbots Life · 5 min read · Jun 21, 2020


Image courtesy: ucan-courses.com

Can we add some Emotions to Bots?

According to the Collins Dictionary, “An emotion is a feeling such as happiness, love, fear, anger, or hatred, which can be caused by the situation that you are in or the people you are with.” So, when I speak with my robot, I expect good responses, with emotions, based on the conversation between me and the Bot. To achieve this, we should either create an artificial brain or identify emotional cues and respond with appropriate words. Artificial-brain research is going on in most of the world-famous AI labs. The latter can be achieved using NLU and NLP.

In our research journey, we are trying to build a Bot that understands human emotions based on the sentences used. We may argue that a bot is, after all, a machine and cannot absorb our emotions, but that depends on how much capability we build into it. When a human talks to another human, he understands not only the words but also the facial expressions, body gestures, and the tone and pitch of the voice. Some of the conversation even happens without any words at all. That is not yet happening with Bots. Let us look at some examples with existing popular bots. These are all sample utterances used for testing purposes.

Me: Siri, “I am so happy to tell you that I passed my exams”
Siri: “I don’t know what that means. If you like, I can search the web for …”

Me: Siri, “Last year, I lost my loving dog”
Siri: I don’t know how to respond to that.

Me: Siri, “I failed in my exams”
Siri: I don’t know what you mean by ‘I failed in my exams’. How about a web search for it?

Me: Siri, “I missed my flight”
Siri: I found this on the web (a few website suggestions)

Me: Hey Google, “I am so happy to tell you that I passed my exams”
Google: “I am sure you will go great. Let me know if there is anything I can do to help”

Me: Hey Google “Last year I lost my loving dog”
Google: Sorry, I don’t understand.

Me: Hey Google, “I failed in my exams”
Google: “I am sure you will go great. Let me know if there is anything I can do to help” :)

Me: Hey Google, “I missed my flight”
Google: “On the website USAToday.com, they say, if you think you’re going to miss a flight, experts say…”

All the above examples give us raw information stored somewhere in a database and retrieved on demand. I could not find any emotional touch in the responses.


Rosalind Picard, Director of Affective Computing Research at the Massachusetts Institute of Technology, says, “Even your dog knows when you’re getting frustrated with it. Siri doesn’t yet have the intelligence of a dog” [1]. NLP researchers note that emotion recognition can be used in call centers, where the goal is to detect the emotional state of the caller and provide feedback on the quality of the service.

The data in human conversation is highly unstructured. When one human talks with another, he understands the emotions and feelings behind the words, whereas a machine does not.
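As a rough illustration of what text-based emotion recognition means, here is a minimal keyword-matching sketch in Python. This is purely illustrative: a real system (including the pilot described later) would use a trained NLU model rather than a hand-made word list, and the lexicon below is a hypothetical example, not part of any actual product.

```python
# Minimal keyword-based emotion detector (illustrative only; a real
# system would use a trained NLU model, not a hand-made word list).
EMOTION_LEXICON = {
    "happy": {"happy", "passed", "glad", "great"},
    "sad":   {"lost", "failed", "missed", "sorry"},
    "angry": {"angry", "furious", "annoyed"},
}

def detect_emotion(utterance: str) -> str:
    """Return the emotion whose keywords best match the utterance."""
    words = set(utterance.lower().replace(",", " ").split())
    scores = {
        emotion: len(words & keywords)
        for emotion, keywords in EMOTION_LEXICON.items()
    }
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "neutral"

print(detect_emotion("I am so happy to tell you that I passed my exams"))  # happy
print(detect_emotion("Last year, I lost my loving dog"))                   # sad
```

Even this toy classifier would let a bot distinguish “I passed my exams” from “I lost my loving dog”, which the assistants in the transcripts above failed to do.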

AI chatbots have become extremely popular for delivering good customer service. A few questions keep popping up in the customer-service environment.

If a customer is sad, can the Bot cheer him up with some motivating words?
If a customer is angry, can the Bot sense (via NLU) that he is angry and calm him down?
If a customer shares happy news, can the Bot understand and give him compliments?
If a customer is not well, can the Bot call his physician and ask him to talk to the customer?
If a customer is dejected, can the Bot call his motivator and arrange some motivation?
If a customer feels bored, can the Bot offer some tips and anecdotes to overcome the boredom?
The list goes on and on.

Some researchers argue that a machine does not have to be human-like to be engaging; in some cases, human-like chatbots have created more frustration for the users engaging with them.

Soon, chatbots may be the preferred user interface for many of the activities performed through a mobile app, a webpage, or a dedicated application. The success factor for chatbots is how well they can sustain conversational flows while providing useful output. How are we going to add emotional words to a Bot, so that it can respond with some supporting words? After all, people don’t need just information; they also need some supporting words from the robots.
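One simple way to attach “supporting words” to a detected emotion is a response table keyed by emotion label. The sketch below assumes an emotion classifier already exists upstream; the labels and replies are made-up examples, not text from any real bot, and a production system would draw its replies from dialog design rather than a hard-coded table.

```python
import random

# Hypothetical supportive replies per detected emotion; in a real bot
# these would come from dialog design, not a hard-coded table.
SUPPORTIVE_REPLIES = {
    "happy":   ["Congratulations! That is wonderful news."],
    "sad":     ["I am so sorry to hear that. I am here if you want to talk."],
    "angry":   ["I understand this is frustrating. Let me try to help."],
    "neutral": ["Tell me more."],
}

def respond(emotion: str) -> str:
    """Pick a supportive reply for the detected emotion (neutral fallback)."""
    replies = SUPPORTIVE_REPLIES.get(emotion, SUPPORTIVE_REPLIES["neutral"])
    return random.choice(replies)

print(respond("sad"))  # I am so sorry to hear that. I am here if you want to talk.
```

Chained after an emotion detector, this is enough to turn “Last year, I lost my loving dog” into a sympathetic reply instead of a web search.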

A pilot program was built with sample data using Dialogflow, GCP (Google Cloud Platform), NLU, and NLP. Please find it in the attached video.

In this conversation, the human (me) changes his voice based on the statements used. As of now (June 2020), the Google Assistant voice is monotonous throughout the dialog, irrespective of its context. Hopefully, Bots will soon use voice modulation based on the emotions and context of the conversation.

[1] https://cacm.acm.org/magazines/2018/4/226375-artificial-emotional-intelligence/fulltext
