Curious whether your customers are satisfied with your virtual assistant? You can now equip your Conversational AI with sentiment analysis so that it can detect emotions.
Sentiment analysis is the branch of machine learning that tries to decode the sentiment behind what people say, whether positive, negative, or something more complex.
Take the example of a travel agency that incorporates a conversational AI into its customer service process. There are plenty of accommodation options, but it can still be difficult to fulfill every wish a client may have. What if someone wants to stay in a place surrounded by nature but would also like to spend evenings in the city? Or if the weather should be warm enough for swimming in the ocean but not too hot for hiking? What if the client has high expectations of the hotel’s standards but doesn’t want it to be too expensive?
If your AI can’t find a place that meets all the criteria, your client may feel frustrated and leave complaints. With sentiment analysis, your AI will be able to recognize this pattern and react accordingly.
Sentiment analysis searches the text for patterns that may indicate the user’s feelings. The text is then classified as positive, negative, or neutral.
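To make the idea concrete, here is a deliberately simple, lexicon-based sketch of that classification step. It is not how Cognigy.AI (or any production system) implements sentiment analysis; the word lists and scoring are toy assumptions chosen only to show how text can map to a positive/negative/neutral verdict.

```python
# Toy lexicon-based sentiment classifier, purely illustrative.
# Production systems use trained models, not hand-made word lists.
POSITIVE = {"great", "good", "love", "perfect", "thanks", "wonderful"}
NEGATIVE = {"bad", "terrible", "frustrated", "disappointed", "complaint", "awful"}


def classify_sentiment(text: str) -> str:
    """Return 'positive', 'negative', or 'neutral' for the given text."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"


print(classify_sentiment("This hotel looks perfect, thanks!"))  # positive
print(classify_sentiment("I'm frustrated, this is terrible."))  # negative
```

Even this crude version shows the pattern-matching idea: the classifier does not understand the sentence, it only detects signals that correlate with a feeling.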
With Cognigy.AI, you can keep your clients from getting frustrated by using sentiment analysis, available as one of the Custom Modules. You can adapt your virtual agent’s response strategy depending on the sentiment last received from the user.
To employ Cognigy.AI’s sentiment analysis in your virtual agent, use the sentiment node. The results of the analysis appear in either the Context or the Input object.
Now you can use this new information to make the conversation more natural. A best practice is to use the verdict of the analysis as the condition of an if-node. You can then choose how to answer when the condition is fulfilled and when it’s not.
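The branching an if-node expresses can be sketched as plain code. Note that the field name `sentiment` and the reply texts below are illustrative assumptions for the travel-agency scenario, not Cognigy.AI’s actual object schema or API.

```python
# Sketch of if-node-style branching on a sentiment verdict.
# The "sentiment" key and reply texts are hypothetical examples.
def choose_reply(input_object: dict) -> str:
    if input_object.get("sentiment") == "negative":
        # Condition fulfilled: de-escalate and offer a human handover.
        return ("I'm sorry we couldn't find a perfect match. "
                "Would you like to talk to one of our travel experts?")
    # Condition not fulfilled: continue the normal flow.
    return "Great! Here are some options that match your criteria."


print(choose_reply({"sentiment": "negative"}))
print(choose_reply({"sentiment": "positive"}))
```

The point of the pattern is that the agent’s wording adapts to the user’s emotional state instead of replying the same way regardless of frustration.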
© 2021 Cognigy
All rights reserved.