Cognigy.AI v4.63 features substantial upgrades across the board, including enhanced search experiences with Source Tagging in Knowledge AI, native gpt-3.5-turbo-instruct support, and a more streamlined setup of voice-based Agent Assist.
Precise Knowledge Search Using Source Tagging
In Cognigy.AI, a Knowledge Store is a repository comprising multiple Knowledge Sources. These sources may contain overlapping types of information but differ in detail based on specific variables like product categories, audience groups, and more.
Taking the example of a telecommunications provider, the Knowledge Store might contain user manuals for different router models. Each manual has troubleshooting tips unique to the model and technology (like cable or fiber optics). When an AI Agent is tasked with retrieving information to answer a customer query, the presence of similar information across multiple sources can lead to confusion.
The new Source Tagging feature aims to resolve this issue. It involves categorizing the different Knowledge Sources into distinct content groups using tags. These tags act as identifiers that provide context to the information.
When a customer query arrives, the AI Agent can use these tags to narrow the search scope and retrieve information only from the tagged sources relevant to the query. Restricting the search to the most pertinent sources improves precision and ensures that the retrieved answer is tailored to the specific question.
Source Tags can be combined with CognigyScript to enable dynamic filtering of search scope during real-time conversations.
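To make the idea concrete, here is a minimal, language-agnostic sketch of tag-based scope filtering; the data structures and function names are illustrative only and do not reflect Cognigy's internal implementation or API.

```python
# Illustrative sketch (not Cognigy's API): knowledge sources carry tags,
# and a query is answered only from sources matching the requested tags.

from dataclasses import dataclass, field

@dataclass
class KnowledgeSource:
    name: str
    tags: set = field(default_factory=set)
    chunks: list = field(default_factory=list)

def filter_sources(sources, required_tags):
    """Keep only the sources that carry every requested tag."""
    required = set(required_tags)
    return [s for s in sources if required <= s.tags]

sources = [
    KnowledgeSource("router-a-manual", {"router-a", "cable"},
                    ["Reboot Router A by holding the reset button."]),
    KnowledgeSource("router-b-manual", {"router-b", "fiber"},
                    ["Router B's status LED blinks green during sync."]),
]

# A query about the fiber model searches only the matching source,
# so troubleshooting tips for the cable model cannot leak in.
scope = filter_sources(sources, {"fiber"})
print([s.name for s in scope])  # ['router-b-manual']
```

In Cognigy.AI, the equivalent filtering happens inside Knowledge Search, with the tag values optionally supplied dynamically via CognigyScript at conversation time.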
Native gpt-3.5-turbo-instruct Integration
The new version also introduces native support for the gpt-3.5-turbo-instruct model from OpenAI and Azure OpenAI. OpenAI introduced this model as a replacement for its earlier Instruct model and several other legacy text-completion models.
While GPT-3.5 Turbo is designed to be a chat model, gpt-3.5-turbo-instruct is more task-oriented and excels at direct question answering and executing specific instructions. As such, it provides succinct, less chatty responses and is ideal for scenarios that demand accuracy and clarity. The pricing of gpt-3.5-turbo-instruct is in line with other GPT-3.5 Turbo models with 4K context, making it an affordable option for precise and efficient AI interactions.
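For context on how the instruct model differs at the API level, the sketch below shows a direct call outside of Cognigy, using the OpenAI Python SDK. Unlike the chat models, gpt-3.5-turbo-instruct is served through the Completions endpoint with a plain prompt string rather than a list of chat messages; the helper function and prompt here are illustrative assumptions, not part of the release.

```python
# Sketch: calling gpt-3.5-turbo-instruct via the Completions API
# (prompt string), as opposed to the Chat Completions API (messages).
import os

def build_instruct_request(instruction, max_tokens=128):
    """Assemble request parameters for a Completions call (illustrative)."""
    return {
        "model": "gpt-3.5-turbo-instruct",
        "prompt": instruction,          # plain string, no chat roles
        "max_tokens": max_tokens,
        "temperature": 0,               # favor succinct, deterministic output
    }

params = build_instruct_request(
    "Translate 'router' into German. Answer with one word."
)

# The network call is only attempted when an API key is configured
# (requires the `openai` package, v1.x):
if os.getenv("OPENAI_API_KEY"):
    from openai import OpenAI
    client = OpenAI()
    response = client.completions.create(**params)
    print(response.choices[0].text.strip())
```

Within Cognigy.AI itself, the model is selected in the LLM settings; no such client code is required.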
Streamlined Voice-Enabled Agent Assist Setup
Cognigy.AI v4.63 further simplifies the configuration of the voice-activated Agent Assist Workspace with new features:
- A new Agent Assist section in the Transfer Node (Dial), removing the need for a separate Code Node.
- A new Agent Assist Voice Endpoint that handles transcription, removing the need for manual Webhook Transformer configuration. Simply insert the Endpoint URL into the Transcription Webhook field of the Transfer Node to enable transcription.
Other Improvements for Cognigy.AI
Cognigy Virtual Agents
- Added the Glossary Input ID field to the Endpoint settings for the DeepL Translate Pro provider
- Added feedback in the Knowledge AI Wizard when the LLM is not configured correctly
- Added the ability to select existing connections in the Connection Configuration section within the Knowledge AI wizard
- Added the Agent Assist Workspace link for the 8x8 handover provider
- Added the Agent Assist Workspace link for the Genesys Cloud Open Messaging handover provider
- Removed the Calls dashboard
- Added the limit and sort filters to the detailed view of the Label Summary chart
- Improved the Messages API by adding proper status codes to its responses
- Added the capability to search for canned responses by shortcode or content, and allowed the use of multiple words for searching
Cognigy Voice Gateway
- Permitted top-level FQDN as an outbound network address for SIP Gateways of a carrier in the Voice Gateway Self-Service Portal
- Introduced a new Endpoint, Agent Assist Voice, which enables a voice-based experience with your Virtual Agents
- Improved the Transfer Node type Dial with Agent Assist capabilities
For further information, check out our complete Release Notes here.