Advanced Prompt Engineering, Usability Enhancements, and More with Cognigy.AI 2025.13 Release

7 min read
Nhu Ho · July 3, 2025

The latest release of Cognigy.AI introduces the new LLM Prompt and Load AI Agent Nodes for developers and enterprise teams seeking greater precision, configurability, and visibility in building AI Agent solutions.

Additionally, it includes various quality-of-life improvements that streamline onboarding, facilitate testing, and enrich Live Agent analytics.

Attaining Ultimate Control Over Agent Prompting

Within Cognigy.AI, the LLM Prompt Node allows you to directly interface with the LLM and precisely configure the system prompt for execution.

What’s New

Previously limited to text generation, the LLM Prompt Node now integrates advanced functionalities in line with Cognigy’s Agentic AI paradigm, including:

  • Tool Actions: Seamlessly integrate and invoke both native and MCP Tools within your LLM calls.
  • Image Support: Incorporate real-time image analysis and processing into conversational experiences.
  • Conversation Transcript Access: Dynamically utilize conversation history to enable more context-aware responses.

A new Load AI Agent Node additionally allows developers to load an existing AI Agent configuration and persona into the Input or Context object. This facilitates the reuse of AI Agent settings in the LLM Prompt Node, promoting consistency and efficiency in complex Flows.
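
For illustration, here is a minimal sketch of how a Code Node placed before the LLM Prompt Node might turn a loaded configuration into a system prompt. The storage key (context.aiAgent) and the shape of the persona object are assumptions made for this example; use whatever key and fields your Load AI Agent Node actually writes.

    // Minimal sketch, not the official API: assumes the Load AI Agent Node has
    // stored its configuration under context.aiAgent and that this Code Node
    // runs before the LLM Prompt Node. Adjust the key and fields to your setup.
    declare const context: Record<string, any>; // Context object provided by the Code Node runtime

    const agent = context.aiAgent ?? {};

    // Compose a system prompt from the loaded persona and store it back on the
    // Context object so the LLM Prompt Node can reference it.
    context.systemPrompt = [
      `You are ${agent.name ?? "an AI Agent"}.`,
      agent.persona?.description ?? "",
      agent.persona?.instructions ?? "",
    ]
      .filter(Boolean)
      .join("\n\n");

Because the loaded configuration is plain data in the Input or Context object, you can reference it like any other value, for example via {{ context.systemPrompt }} in the LLM Prompt Node’s system prompt field.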

Why It Matters

Together, these two Nodes provide maximum transparency and control over prompt construction for agent execution. You can replicate the behavior of the AI Agent Node while applying bespoke prompt engineering tailored to your enterprise scenarios.

Deep-Dive Live Agent Analytics with New OData Parameters 

The Live Agent OData endpoint has been augmented with additional parameters across the Label, User (human agents and supervisors), and Conversation collections. New attributes include:

User Activity Metrics:
  • Most recent login timestamp
  • Last sign-in and sign-out timestamps
  • Total number of logins
Conversation-Level Details:
  • Conversation ID per Label
  • The date and time when a conversation was assigned and ended

Impact for Enterprise Teams

These enhancements support more granular, filterable data queries, enabling the creation of richer dashboards and more targeted insights. Whether you're optimizing resource planning or refining support processes, these metrics help inform decision-making.
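
As an example, a reporting job could pull the new conversation-level fields with a standard OData query. The TypeScript sketch below is illustrative only: the endpoint, collection, and property names are assumptions, so check the Live Agent OData documentation for the exact schema and authentication details.

    // Illustrative sketch of an OData query for the new conversation-level
    // attributes. The host, collection, and field names ("Conversations",
    // "assignedAt", "endedAt") are assumptions, not the documented schema;
    // replace them with the names from your installation.
    async function fetchRecentConversations(apiKey: string) {
      const base = "https://<your-live-agent-host>/odata/Conversations";
      const query = [
        "$select=conversationId,assignedAt,endedAt",
        "$filter=" + encodeURIComponent("endedAt ge 2025-06-01T00:00:00Z"),
        "$orderby=" + encodeURIComponent("endedAt desc"),
      ].join("&");

      const res = await fetch(`${base}?${query}`, {
        headers: { Authorization: `Bearer ${apiKey}` },
      });
      return (await res.json()).value; // OData responses wrap rows in a "value" array
    }

The same standard OData query options ($select, $filter, $orderby) apply to the Label and User collections, so login and sign-out timestamps can be retrieved in the same way.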

Accelerated Knowledge Setup During Agent Creation

Setting up your AI Agents is now faster and more flexible. Besides selecting an existing Knowledge Store, you can also upload Knowledge Sources directly from the Agent Creation Wizard, enabling seamless knowledge integration right within the creation workflow.

Broadened Markdown Support to Enhance Testing and UX 

Developers and testers can now experience richer, more accurate output previews with the new Render Markdown option in the Interaction Panel settings. By enabling this feature, you can validate exactly how your AI Agents will render responses that include:

  • Lists and bullet points
  • Hyperlinks and inline formatting
  • Bold and italic text
  • Headers, tables, and footnotes

This capability ensures your conversational UI behaves as expected, improving quality assurance and expediting the iteration process before deployment.
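
If you want a quick way to exercise the new option, you can send a small markdown sample from a Code Node and toggle Render Markdown on and off to compare the results. The snippet below is a rough sketch; the actions.output call is assumed from the Code Node documentation, and the markdown content is arbitrary sample text.

    // Rough sketch for testing markdown rendering in the Interaction Panel.
    // The output helper is assumed from the Code Node docs; verify the exact
    // call available in your Cognigy.AI version.
    declare const actions: { output: (text: string, data?: object) => void };

    actions.output(
      [
        "**Order summary**",
        "",
        "- 2x *Espresso* - 5.00 EUR",
        "- 1x *Croissant* - 2.20 EUR",
        "",
        "[View full receipt](https://example.com/receipt)",
      ].join("\n")
    );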

Plus, markdown support is now also available for the Privacy Notice field in Cognigy Webchat, allowing you to customize the text with bold and italic formatting and hyperlinks.

Other Improvements

Cognigy.AI

  • Added support for OpenAI's and Microsoft Azure OpenAI's gpt-4.1, gpt-4.1-mini, and gpt-4.1-nano models, and Mistral AI's mistral-medium model. Additionally, changed the version mapping of Anthropic's and Mistral AI's models to include the *-latest suffix, which ensures that the latest version of the model is used when you select it. The version mapping applies to the following models:
    • Anthropic's claude-3-5-sonnet and claude-3-7-sonnet are now listed as claude-3-5-sonnet-latest and claude-3-7-sonnet-latest respectively.
    • Mistral AI's mistral-large-2411, mistral-small-2501, and pixtral-large-2411 are now listed as mistral-large-latest, mistral-small-latest, and pixtral-large-latest respectively. mistral-medium is listed as mistral-medium-latest.
  • Added capability to collapse debug messages in the Interaction Panel and copy them to the clipboard
  • Improved stability of the handover database connection and optimized message sending to handover providers by using caching

Cognigy Voice Gateway

  • Improved general stability of Voice Gateway

For further information, check out our complete Release Notes here.
