Apr 21, 2020
15 min

Make your conversational AI detect emotions using sentiment analysis

written by: Sylwia Jagiela

Curious whether your customers are satisfied with your virtual assistant? Sentiment analysis is the branch of machine learning that tries to decode the sentiment behind what people say: positive, negative, or even more complex emotions.

Let’s take an example of a travel agency which incorporates a conversational AI in their customer service process. There are a lot of different accommodation options, but it can still be difficult to fulfill all the wishes your clients may have. What if someone wants to stay in a place surrounded by nature but would also like to spend evenings in the city? Or if the weather should be warm enough for swimming in the ocean but not too hot for hiking? What if the client has high expectations of the hotel’s standards but doesn’t want it to be too expensive?  

If your AI can’t find a place that meets all the criteria, your client may feel frustrated and leave complaints. With sentiment analysis, your AI will be able to recognize this pattern and react accordingly.

The goal of sentiment analysis is to search the text for patterns that may indicate the user’s feelings. As a result, the text is classified as having positive, negative, or neutral sentiment.
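To make the idea concrete, here is a minimal lexicon-based sketch of the positive/negative/neutral classification described above. This is illustrative only: the word lists are made up for the example, and production engines such as Cognigy.AI’s use far more sophisticated models.

```javascript
// Toy sentiment classifier: count matches against tiny hand-made word lists.
// (Illustrative sketch only — not how a real sentiment engine works.)
const POSITIVE = new Set(["great", "good", "love", "perfect", "happy"]);
const NEGATIVE = new Set(["bad", "awful", "hate", "frustrated", "expensive"]);

function classifySentiment(text) {
  let score = 0;
  // Tokenize into lowercase words and tally positive vs. negative hits.
  for (const word of text.toLowerCase().match(/[a-z]+/g) || []) {
    if (POSITIVE.has(word)) score += 1;
    if (NEGATIVE.has(word)) score -= 1;
  }
  return score > 0 ? "positive" : score < 0 ? "negative" : "neutral";
}

console.log(classifySentiment("I love this hotel, it is perfect!")); // "positive"
console.log(classifySentiment("Too expensive and the food was awful.")); // "negative"
```

Even this crude scoring captures the core idea: the verdict is a single label that downstream logic, such as a flow node, can branch on.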

With Cognigy.AI, you can keep your clients from getting frustrated by using sentiment analysis, available as one of the Custom Modules. You can adapt your virtual agent’s response strategy depending on the last sentiment received from the user.

How to use it 

In order to employ Cognigy.AI’s sentiment analysis in your virtual agent, use the sentiment node. 

The results of the analysis will appear in either the Context or Input Object. 
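As a rough picture of what that means in practice, the snippet below shows a hypothetical shape for the stored verdict. The field name `sentiment` and the surrounding structure are assumptions for illustration; the actual keys depend on how the Sentiment node in your flow is configured to store its output.

```javascript
// Hypothetical result as it might appear in the Input or Context object
// after the Sentiment node runs (field names are illustrative, not the
// documented Cognigy.AI schema).
const input = {
  text: "The hotel was dirty and overpriced.",
  sentiment: "negative" // assumed storage location of the verdict
};

console.log(input.sentiment); // "negative"
```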

 

 

Now you can use this new information to make the conversation more natural. A best practice is to use the verdict of the analysis as the condition of an if-node.

 

Then you can choose how the agent answers when the condition is fulfilled and when it isn’t.
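The branching described above might look like the following sketch. The path `context.sentiment` is an assumption for illustration; in your flow, the condition would reference wherever your Sentiment node actually stores the verdict.

```javascript
// Sketch of if-node branching on the sentiment verdict.
// `context.sentiment` is an assumed storage location, not a documented path.
const context = { sentiment: "negative" };

// The expression you would place in the if-node's condition field:
const isUnhappy = context.sentiment === "negative";

if (isUnhappy) {
  // Negative branch: de-escalate and offer a human handover.
  console.log("I'm sorry we couldn't find a perfect match. Let me connect you to an agent.");
} else {
  // Neutral/positive branch: continue the booking flow.
  console.log("Great! Shall I book this option for you?");
}
```

Splitting the flow this way lets the agent de-escalate with frustrated users while keeping the happy path short for everyone else.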