LLM-Based Intent Reranking and More with Cognigy.AI v4.76

4 min read
Nhu Ho | May 22, 2024

Building on the previous release of the GenAI embedding model option for NLU training, we’re excited to introduce our second native hybrid AI feature to enhance intent recognition accuracy with Cognigy.AI v4.76.

GenAI-Powered Post-Processing of the NLU Pipeline

The new feature operates in the post-processing stage of the NLU pipeline, providing a sophisticated layer of analysis and refinement after the initial intent recognition has been performed.

It is particularly beneficial for large, intricate enterprise intent models with many overlapping intents. In such scenarios, the initial NLU mapping can struggle to differentiate between closely related intents, and this is where the LLM-powered reranking capability excels.

How It Works

After the NLU identifies the top five intents, the LLM analyzes and adjusts their ranking based on several inputs, including the intent names and descriptions. This ensures the most relevant intent is mapped, improving the accuracy of the final intent selection. A simplified sketch of this step follows.
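To make this concrete, here is a minimal TypeScript sketch of what such a reranking step could look like. It is an illustration only: the IntentCandidate shape, the prompt wording, and the rerankIntents function are assumptions made for this example, not Cognigy's actual implementation, which is enabled through the NLU settings rather than written by hand.

```typescript
// Hypothetical shape of a candidate produced by the initial NLU mapping.
interface IntentCandidate {
  name: string;
  description: string;
  score: number; // score from the initial NLU mapping
}

// Any chat-completion-style LLM call can be plugged in here.
type LlmComplete = (prompt: string) => Promise<string>;

// Ask the LLM to rerank the top candidates for the given user input.
// Returns the candidates reordered by the LLM, falling back to the
// original NLU order if the response cannot be parsed.
async function rerankIntents(
  userInput: string,
  candidates: IntentCandidate[],
  complete: LlmComplete
): Promise<IntentCandidate[]> {
  const list = candidates
    .map((c, i) => `${i + 1}. ${c.name}: ${c.description}`)
    .join("\n");

  const prompt = [
    "Given the user message and the candidate intents below, return the",
    "candidate numbers ordered from most to least relevant, as a",
    'comma-separated list (for example "3,1,2,4,5").',
    "",
    `User message: "${userInput}"`,
    "Candidate intents:",
    list,
  ].join("\n");

  const response = await complete(prompt);

  // Parse the returned order and keep only valid, unique candidate indexes.
  const order = (response.match(/\d+/g) ?? []).map((n) => parseInt(n, 10) - 1);
  const unique = Array.from(new Set(order)).filter(
    (i) => i >= 0 && i < candidates.length
  );

  // Fall back to the original NLU ranking if the LLM output is unusable.
  if (unique.length !== candidates.length) return candidates;
  return unique.map((i) => candidates[i]);
}
```

In Cognigy.AI itself this reranking runs inside the NLU pipeline, so no custom code is required to use the feature.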

Example

Consider the following user input: “I lost my Internet password. I want to reset it but can't log into my account now. Can you please help?”

As humans, we immediately grasp the underlying nuances and recognize the user’s end goal is to restore their Wi-Fi or Internet access. However, with multiple similar intents available, traditional NLU might incorrectly map the user intent to “Lost online account password” instead of “Lost Wi-Fi password”.

With Cognigy.AI v4.76, the LLM-based Intent Reranking lets you leverage GenAI to boost contextual understanding and reduce errors in processing ambiguous sentences. See the demo below.


For a detailed guide on how to set it up, visit our documentation.

A Recap of Cognigy’s Hybrid AI Approach

This new feature complements the existing pre-processing enhancement, exemplifying Cognigy’s unique hybrid AI approach to deliver superior intent recognition and enterprise-grade performance.

  • Pre-Processing (GenAI Embedding Model): Enhances the NLU model training by embedding richer semantic understanding into the initial intent recognition process. See the previous release blog.
  • Post-Processing (LLM-Based Intent Reranking): Refines and optimizes the recognized intents by leveraging the deep contextual understanding of LLMs, ensuring the final intent selection is accurate. A simplified sketch of how the two stages fit together follows this list.
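For illustration, the hypothetical sketch below (reusing the rerankIntents function from the earlier example) shows how an embedding-similarity step could produce the top candidates that the LLM then reranks. The EmbeddedIntent shape and the cosine helper are assumptions made for this example, not Cognigy's internal implementation.

```typescript
// Hypothetical: each intent stores an embedding of its example sentences,
// computed by the GenAI embedding model during training.
interface EmbeddedIntent {
  name: string;
  description: string;
  embedding: number[];
}

// Cosine similarity between two equal-length vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Pre-processing: score every intent by embedding similarity and keep the top five.
// Post-processing: let the LLM rerank those five candidates and pick the winner.
async function hybridIntentMapping(
  userInput: string,
  userEmbedding: number[],
  intents: EmbeddedIntent[],
  complete: LlmComplete
): Promise<IntentCandidate> {
  const topFive: IntentCandidate[] = intents
    .map((intent) => ({
      name: intent.name,
      description: intent.description,
      score: cosine(userEmbedding, intent.embedding),
    }))
    .sort((a, b) => b.score - a.score)
    .slice(0, 5);

  const reranked = await rerankIntents(userInput, topFive, complete);
  return reranked[0];
}
```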

Other Improvements

Cognigy.AI

  • Added intent descriptions to the input.nlu.scores object to improve intent ranking (see the sketch after this list)
  • Updated the @cognigy/extension-tools package so that it includes the latest features added to the Extension Tools framework
  • Improved the Genesys OM handover provider to separately display end-user and virtual agent messages within Genesys
  • Updated Demo Webchat to version 2.59.2
  • Renamed Agent to Project in the Cognigy.AI UI
  • Added the possibility to process bot messages while using the Genesys OM handover provider. These messages are injected into the flow. For on-premises environments, the feature flag GENESYS_CLOUD_OM_HANDLE_BOT_MESSAGE: "true" must be set in the environment variables to enable this functionality
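As a purely illustrative example of the first item above, a Code Node could read the ranked intents and their newly added descriptions roughly as follows. The field names used here (name, score, description) are assumptions; inspect input.nlu.scores in the Interaction Panel to confirm the actual structure.

```typescript
// The Cognigy Code Node provides `input` and `api` at runtime; they are
// declared here only so the snippet type-checks on its own.
declare const input: any;
declare const api: any;

// Hypothetical field names (name, score, description) on each ranked entry.
const scores: any[] = input.nlu?.scores ?? [];

// Log the ranked intents with their descriptions for debugging.
scores.forEach((candidate, rank) => {
  api.log(
    "debug",
    `#${rank + 1} ${candidate.name} (${candidate.score}): ${candidate.description}`
  );
});
```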

Cognigy Live Agent

  • Added new activity messages about assignments within the conversation. These messages appear when the selected team is changed and when auto-assignment occurs as the conversation is created
  • Improved processing of queued conversations when reassignment logic is enabled for the inbox and the account has a maximum conversation limit enabled
  • Deactivated the agent auto-logout by default

Cognigy Voice Gateway

  • Updated vg-core to 0.1.70, adding new Microsoft STT languages
  • Updated the base image to node:18.20-alpine3.19
  • Added support for Azure Speech Services with private endpoints

For further information, check out our complete Release Notes here.
