Aleph Alpha Native Integration, Enhanced LLM Model Selection and More with Cognigy.AI v4.65

5 min read
Nhu Ho
December 3, 2023

The latest release of Cognigy.AI brings native Aleph Alpha support, enhanced flexibility in LLM model selection, and further improvements to the Deepgram integration.

Native Aleph Alpha Support for Enterprises

Continuing our commitment to providing enterprise-grade flexibility in LLM deployment, Cognigy.AI v4.65 introduces native support for the Aleph Alpha Luminous model family. This addition expands our range of supported LLMs, which already includes models from OpenAI, Azure OpenAI, Anthropic, and Google AI.

Compared to other Generative AI vendors, Aleph Alpha differentiates itself with a sovereign AI strategy geared toward enterprise and government applications. It offers customizability, on-premises deployment, and compliance features such as identity and access management, reproducibility, and auditability.

Within Cognigy.AI, the Luminous model family can be utilized for the LLM Prompt Node and Knowledge AI Answer Extraction feature.

Cognigy.AI - Aleph Alpha Integration
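For orientation, the snippet below shows what a plain completion call to a Luminous model looks like through Aleph Alpha's Python client. It is only a minimal sketch with a placeholder API token and luminous-extended as an example model; within Cognigy.AI, this request is handled for you once the LLM Resource is configured.

```python
from aleph_alpha_client import Client, CompletionRequest, Prompt

# Placeholder token; in practice this comes from your Aleph Alpha account.
client = Client(token="YOUR_ALEPH_ALPHA_API_TOKEN")

request = CompletionRequest(
    prompt=Prompt.from_text("Summarize our return policy in one sentence:"),
    maximum_tokens=64,
)

# "luminous-extended" is one example from the Luminous model family.
response = client.complete(request, model="luminous-extended")
print(response.completions[0].completion)
```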

Swiftly Adapt to Generative AI Evolution

The landscape of Generative AI is evolving at an unprecedented pace, with continuous advancements and model upgrades becoming the standard. Take OpenAI's GPT-4, introduced in March this year – the GPT-4 suite has already expanded to six variants, including the two newest GPT-4 Turbo models, unveiled just three weeks ago.

To help you stay at the forefront of these developments with minimal effort, Cognigy.AI now lets you select and use any specific model within an LLM family via the 'Custom model' parameter in your LLM Resource configuration. For instance, specifying 'gpt-4-1106-preview' lets you harness the advanced capabilities of the newly released GPT-4 Turbo. Once a custom model is set, it overrides the default LLM model.

This feature is currently available for OpenAI and Aleph Alpha providers.
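For reference, the custom model value is the same identifier the provider's own API expects. The sketch below (assuming the OpenAI Python SDK v1+ with an OPENAI_API_KEY set in your environment) shows where a string like 'gpt-4-1106-preview' would appear in a direct API call:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The same identifier you would enter in the 'Custom model' field.
response = client.chat.completions.create(
    model="gpt-4-1106-preview",
    messages=[
        {"role": "user", "content": "List three benefits of GPT-4 Turbo."}
    ],
)
print(response.choices[0].message.content)
```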

Optimized Transcript Quality with Deepgram Smart Formatting

Cognigy.AI v4.65's enhanced integration with the Deepgram Speech-to-Text (STT) service optimizes transcript quality for enterprise applications. The new Smart Formatting feature turns raw transcripts into polished, readable text by applying advanced formatting to dates, times, currencies, phone numbers, emails, and URLs.

Before Smart Formatting:
"I woke up at seven twenty am missed eight alarms got twelve messages about the meeting on april first twenty twenty three then did a thirty minute jog ate six slices of toast with fifty grams of butter at eleven elm street"

After Smart Formatting:
"I woke up at 7:20 AM, missed 8 alarms, got 12 messages about the meeting on April 1, 2023. Then did a 30-minute jog, ate 6 slices of toast with 50 grams of butter at 11 Elm Street."
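At the API level, Deepgram exposes this capability as the smart_format parameter on its /v1/listen endpoint. The minimal sketch below uses Python's requests library with a placeholder API key and audio URL:

```python
import requests

AUDIO_URL = "https://example.com/recordings/morning-routine.wav"  # placeholder

response = requests.post(
    "https://api.deepgram.com/v1/listen",
    params={"smart_format": "true"},  # enable Smart Formatting
    headers={
        "Authorization": "Token YOUR_DEEPGRAM_API_KEY",
        "Content-Type": "application/json",
    },
    json={"url": AUDIO_URL},
)
response.raise_for_status()

# The formatted transcript with normalized dates, times, and numbers.
transcript = response.json()["results"]["channels"][0]["alternatives"][0]["transcript"]
print(transcript)
```

Enabling the feature in Cognigy.AI's Deepgram STT configuration has the same effect; the request above only illustrates the underlying option.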

Other Improvements for Cognigy.AI

Cognigy Virtual Agents

Cognigy Insights

  • Updated the tooltip texts in the Overview, Engagement, and NLU Dashboards

For further information, check out our complete Release Notes here.
