With Cognigy.AI v4.45, we are thrilled to announce the open beta launch of our brand-new Generative AI features, as demonstrated in our latest webinar.
In recent months, Large Language Models (LLMs) have revolutionized AI-powered text understanding and generation, but on their own they cannot be used in customer service automation due to critical limitations. At Cognigy, we aim to help bot builders and conversation designers benefit from the tremendous value that Generative AI offers while mitigating its risks and constraints by using Cognigy.AI as an orchestration layer.
Our platform natively integrates with OpenAI and Azure OpenAI’s GPT models on a “bring your own API key” basis. Generative AI features can be accessed by configuring a connection with the chosen vendor in Agent Settings. These new capabilities will augment enterprise users in two ways:
- Enable faster and easier bot building
- Elevate conversational experiences
🎯 Three steps to get started with Cognigy's Generative AI Features
1. Watch our webinar to get a bird's eye view of LLMs for customer service transformation
2. Sign up for a Cognigy.AI Trial
3. Check out our documentation for a step-by-step guide on how to set it up
A Shortcut to Building High-Performing Virtual Agents
With the ability to churn out creative and contextually relevant outputs in seconds, LLMs provide a powerful tool to produce virtual agent resources like NLU models, entity collections, and execution flows at lightning speed.
This is not to say that they can or should be used 100% autonomously, but rather deliver a solid foundation that bot builders can refine and optimize. As such, the key value here is reduced time-to-value and increased productivity without compromising the quality and control over conversation design.
1. Training Data Generation
With Generative AI in the Cognigy interface, you can quickly create training sentences for a new Intent by entering a brief description in the “Create Intent” panel. Likewise, you can add new sentences to any existing Intent with just one click using the “Generate Sentences” button.
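Under the hood, this kind of sentence generation boils down to a single LLM prompt built from the Intent description. The Python sketch below illustrates the idea; the function name and prompt wording are assumptions for illustration, not Cognigy's internal implementation:

```python
def build_intent_prompt(intent_name: str, description: str, count: int = 10) -> str:
    """Assemble a prompt asking an LLM for example user utterances for one Intent."""
    return (
        f"Generate {count} short example sentences a customer might type "
        f"to express the intent '{intent_name}'.\n"
        f"Intent description: {description}\n"
        "Return one sentence per line."
    )

# Example: generating training data for a hypothetical "CancelOrder" Intent
prompt = build_intent_prompt("CancelOrder", "The customer wants to cancel an existing order.")
print(prompt)
```

The LLM's line-separated reply would then be split into individual training sentences for the bot builder to review and prune.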
2. Lexicon Generation
A Lexicon is a collection of domain-specific key phrases, like names or codes, which help your virtual agent better understand customer inputs. With the new Generative AI functionality directly embedded in the “New Lexicon” panel, you can instantly create a list of lexicon entries for a chosen topic, along with relevant synonyms, eliminating the need for manual curation and input.
3. Flow Generation (Alpha feature)
If you think Generative AI is only about creative text generation, think again. With Cognigy’s new AI-assisted Flow Generation, you can leverage an LLM to immediately create an execution flow from scratch, including the required logic and outputs such as Say Nodes, Question Nodes with entity validation, Logic Nodes, and even Code Nodes with full access to the Cognigy Context.
This can be done in two ways:
- By providing a short description of what the flow should do
- By using an example transcript, thus allowing your virtual agent to replicate successful conversation patterns quickly.
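Conceptually, the generated flow is an ordered list of typed nodes. The sketch below is a hypothetical, simplified representation of such output (Cognigy's actual flow format differs); it only illustrates the kinds of nodes the feature produces:

```python
# Hypothetical simplified flow, as might be generated from the description
# "Greet the user and help them book a flight" -- not Cognigy's real export format.
flow = [
    {"type": "Say", "text": "Hi! I can help you book a flight."},
    {"type": "Question", "text": "Where would you like to fly?", "entity": "destination"},
    {"type": "Logic", "condition": "context.destination != null"},
    {"type": "Code", "script": "context.fare = lookupFare(context.destination)"},
]

# Every generated node must map onto a node type the Flow editor supports,
# which is one reason the feature is gated as an alpha.
SUPPORTED_TYPES = {"Say", "Question", "Logic", "Code"}
assert all(node["type"] in SUPPORTED_TYPES for node in flow)
```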
Conversational Experiences with a Boost
Beyond bot building, Generative AI can help you bring human-like and engaging conversations to life. Cognigy.AI provides an orchestration platform that lets you harness the incredible language flexibility of LLMs within the confines of your business use cases and digital ecosystem.
As such, your virtual workforce can deliver superior service experiences without deviating from the tasks at hand, ensuring customer queries are successfully resolved. Let’s explore three key features that help you achieve just that.
1. AI-Enhanced Outputs
Generative AI often produces outputs that are unpredictable and hard to control, making it a challenge to apply in customer service. The AI-Enhanced Output feature lets you transform rigid scripted responses into dynamic dialogues that adapt to the customer context while preserving information accuracy and specificity. This capability is available for Say, Question, and Optional Question Nodes.
You can enhance bot outputs in two ways:
- Based on Custom Input using CognigyScript
- Based on previous user inputs
With Custom Input, you have granular control over the specific information that should be reflected in the bot output, using the Context objects saved in Cognigy.AI.
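One way to picture the Custom Input variant: scripted placeholders are resolved against the Context first, and only then is the LLM asked to rephrase, so the facts themselves never depend on the model. A minimal Python sketch of that two-step idea (the placeholder syntax mimics CognigyScript's `{{context.x.y}}` form; the prompt wording and Context contents are illustrative assumptions):

```python
import re

# Hypothetical Context object, as saved in Cognigy.AI during a conversation
context = {"order": {"id": "A-1042", "status": "shipped"}}

def enhance_prompt(template: str, ctx: dict) -> str:
    """Resolve {{context.x.y}} placeholders, then wrap the result in a rephrasing prompt."""
    def resolve(match):
        value = ctx
        for key in match.group(1).split("."):
            value = value[key]
        return str(value)
    scripted = re.sub(r"\{\{context\.([\w.]+)\}\}", resolve, template)
    return "Rephrase the following answer naturally, keeping every fact unchanged:\n" + scripted

prompt = enhance_prompt("Your order {{context.order.id}} has been {{context.order.status}}.", context)
print(prompt)
```

Because the order ID and status are injected before the model sees the text, the LLM can vary tone and phrasing without being able to alter the facts.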
2. Complete Text Node
The Complete Text Node allows you to fully define your own GPT Prompt and apply Generative AI to conversation design in any way you desire. This feature gives you direct access to GPT-3 to take advantage of all that LLM technology has to offer, such as summarization, empathetic reactions, and enhanced reasoning.
While the Complete Text Node provides greater flexibility compared to AI-Enhanced Outputs, it also requires more attention during prompt design to ensure high levels of control over bot responses.
3. Extended NLU Pipeline
By combining Conversational AI with LLMs, you can achieve a new level of language understanding and entity extraction, allowing customers to express themselves freely.
Let’s say the virtual agent needs to confirm the number of people for a flight booking, and the user responds “My husband, two kids, and I.” A typical NLU engine might default to two, since that is the only number word in the input, whereas Generative AI can accurately extract the total of four.
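The arithmetic involved can be made concrete with a toy resolver: each mentioned party contributes a headcount, and a preceding number word scales it (husband = 1, two kids = 2, I = 1, total = 4). This is purely an illustration of the reasoning an LLM performs implicitly, not Cognigy or production NLU code, and the word lists are hypothetical:

```python
NUMBER_WORDS = {"one": 1, "two": 2, "three": 3, "four": 4, "five": 5}
PERSON_WORDS = {"i", "husband", "wife", "kid", "kids", "son", "daughter", "friend", "friends"}

def party_size(utterance: str) -> int:
    """Sum headcounts: a number word multiplies the person word it precedes."""
    tokens = [t.strip(",.").lower() for t in utterance.split()]
    total, pending = 0, None
    for tok in tokens:
        if tok in NUMBER_WORDS:
            pending = NUMBER_WORDS[tok]
        elif tok in PERSON_WORDS:
            total += pending if pending is not None else 1
            pending = None
    return total

print(party_size("My husband, two kids, and I"))  # prints 4
```

A rule table like this breaks down quickly on free-form language, which is exactly why delegating the extraction to an LLM is attractive here.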
Final Thoughts: The Future of Conversation Design
Many might wonder if Generative AI will render the role of conversation designers obsolete. While Generative AI can automate many manual design processes and improve conversational experiences, the quality of its outputs largely depends on the prompt input. Bot builders and conversation designers therefore will become prompt engineers, playing a critical role in crafting precise rules and directives to optimize Generative AI results and keep virtual agents on task.
Other Improvements for Cognigy.AI
Cognigy Virtual Agents
- Improved the 8x8 endpoint by adding JWS event-signature validation and an additional tenant ID setting
- Improved by mapping the Cognigy.AI Quick Reply payload to the native 8x8 Adaptive Card
- Improved by ignoring "echo" messages from 8x8 in both the endpoint and the handover webhook
- Improved by enabling users to download an Intent Trainer Export from the Task Menu
- Improved by changing the calculation of the ratio of understood messages to not consider null values
- Improved by changing the logic to calculate the handover rate
Cognigy Live Agent
- The administrator can now set the notification configuration, which overrides the settings chosen by the agent
- Added the ability to include clickable hyperlinks in agent replies
- Messages from an agent are now trimmed of leading and trailing whitespace so the text no longer appears boxed
Cognigy Voice Gateway
- Added the Call Recording Node, which provides the ability to record calls
- Added a call-complete event that is sent to the Flow
- Implemented Send Metadata in the Generic Node
- User No Input Node: limited the maximum number of no-input retries to 999
For further information, check out our complete Release Notes here.