The newest release of Cognigy.AI lets you harness the powerful combination of LLMs and NLU and enjoy maximum freedom in LLM model selection to stay abreast of the GenAI revolution.

Blending NLU and LLM for Optimal Understanding and Multilingual Capabilities

Natural Language Understanding (NLU) and Large Language Models (LLMs) both bring unique advantages to conversational systems.

  • Language Prowess of LLMs: LLMs are engineered for deep language understanding, context awareness, and nuanced interpretation, enabling them to grasp complex queries better than traditional models, which are often limited to basic syntax.
  • NLU's Precision in Structured Automation: NLU excels at structuring inputs and tying them to specific outcomes, ensuring AI Agents are task-focused and effectively handle specific commands or transactions.
  • Synergy in Handling Variability: By combining LLMs and NLU, AI Agents can process a broader spectrum of input styles and linguistic variations. LLMs can tackle less structured and more implicit inputs, while NLU ensures that the identified intents are linked to concrete actions.

Cognigy.AI v4.74 lets you capitalize on this synergy at the click of a button. In the NLU Settings, you can select a GenAI embedding model to expedite Intent training and instantly boost recognition accuracy. With native support for the text-embedding-3-large model from OpenAI and Azure OpenAI, this feature is ideal for complex, large-scale projects that require multilingual capabilities.
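To illustrate the idea behind embedding-based Intent recognition, here is a minimal sketch. It is not Cognigy's implementation: the toy three-dimensional vectors stand in for the high-dimensional embeddings a model such as text-embedding-3-large would produce, and the intent names and threshold are invented for the example.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy intent embeddings; a real system would embed example phrases
# with the selected embedding model during Intent training.
intent_embeddings = {
    "book_flight": [0.9, 0.1, 0.0],
    "cancel_order": [0.1, 0.9, 0.1],
}

def match_intent(utterance_embedding, threshold=0.7):
    """Return the closest intent, or None if no intent is similar enough."""
    best_intent, best_score = None, 0.0
    for intent, emb in intent_embeddings.items():
        score = cosine_similarity(utterance_embedding, emb)
        if score > best_score:
            best_intent, best_score = intent, score
    return best_intent if best_score >= threshold else None
```

Because the comparison happens in embedding space rather than on surface text, paraphrases and other languages that embed near the training phrases still resolve to the right Intent.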

Note: Using an external GenAI model routes data through the designated service provider.

Extensive Model Support for Integrated LLM Services

The latest release also streamlines the setup of your LLM connections, expanding the Custom Model option across ALL supported LLM services (OpenAI, Azure OpenAI, Anthropic, Google, and Aleph Alpha). This allows you to harness any existing or future model introduced by these vendors and stay at the forefront of the GenAI evolution. (Note: For Google Vertex AI, we support only text-bison models.)

  • Flexibility and Choice: Enjoy complete autonomy in selecting the model that best fits your needs and use cases.
  • Future-Proofing: Remain at the cutting edge of GenAI development and swiftly adopt new models within a few clicks.
  • Cost-Efficiency: Optimize for cost with models that suit your performance requirements at the most economical price point.
  • No Vendor Lock-in: Easily switch between providers as needed, adapting to changes in pricing, service levels, or compliance with industry standards.

When adding a new LLM, the redesigned configuration panel lets you specify your preferred model right off the bat.
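The vendor-agnostic selection described above can be pictured as a small abstraction. Everything here is hypothetical: the class, field names, and model strings are invented for illustration and do not reflect Cognigy's actual configuration schema.

```python
from dataclasses import dataclass

@dataclass
class LLMConnection:
    """Hypothetical connection record: a provider plus any model it offers."""
    provider: str  # e.g. "openai", "azure-openai", "anthropic"
    model: str     # free-form model name, so future models need no code change

def switch_provider(conn: LLMConnection, provider: str, model: str) -> LLMConnection:
    # Changing vendors only swaps the connection record;
    # the flows that reference the connection stay untouched.
    return LLMConnection(provider=provider, model=model)

conn = LLMConnection("openai", "gpt-4o")
conn = switch_provider(conn, "anthropic", "claude-3-5-sonnet")
```

Keeping the model name a free-form field rather than a fixed enum is what makes newly released models usable without waiting for a product update.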

[Image: LLM Resource configuration in Cognigy.AI]

Other Improvements for Cognigy.AI

Cognigy Virtual Agents

  • Redesigned Salesforce conversation polling for better stability
  • Expanded German localization throughout the product
  • Made the API version field required when creating Azure OpenAI models
  • Kept the Genesys Cloud Guest Chat connection open when a conversation disconnects

Cognigy Live Agent

  • Improved the performance of conversation handling and Reports calculations
  • Improved Live Agent performance in high-load scenarios by reducing WebSocket traffic

For further information, check out our complete Release Notes here.
