Nhu Ho | July 19, 2023

From extended LLM integration and improved Live Agent Canned Responses to turnkey entity recognition using pattern-based Question Types, let’s dive into our top highlights from Cognigy.AI v4.55.

Harness Google PaLM 2 for Customer Service Automation

At Cognigy, we strive to help enterprises quickly pivot and capitalize on best-of-market Generative AI solutions, given the unprecedented development speed in this space. To this end, we are thrilled to introduce our newest LLM integration with text-bison-001, a foundation model from Google Vertex AI’s PaLM 2 family.

PaLM 2 surpasses its predecessor with improved reasoning, coding, and multilingual capabilities, and it comes in multiple sizes. Among them, Bison is positioned as the best-value model in terms of capabilities and cost. The text-bison model, for instance, is priced at $0.0010 per 1,000 characters (roughly 250 tokens). With a maximum of 8,192 input tokens, it is fine-tuned to carry out a wide range of language tasks from natural language instructions, including:

  • Classification
  • Sentiment Analysis
  • Entity Extraction
  • Extractive Question Answering
  • Summarization
  • Re-writing text
  • Concept ideation

In addition, because it is developed on the Vertex AI machine learning platform, PaLM 2 adheres to Google’s Data Governance and Responsible AI framework. It incorporates safety filters and attributes to manage and mitigate harmful responses, alongside robust practices for data security and control, making it a strong contender for enterprise use cases.

Within Cognigy.AI, you can now configure text-bison-001 in the LLM Resource at the touch of a button and leverage it for real-time user conversations via the LLM Prompt Node.
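
For orientation, here is a minimal sketch of what a call to the underlying text-bison model looks like through the Vertex AI Python SDK. The project ID, region, and prompt are placeholder assumptions; within Cognigy.AI, the same model is reached without any code via the LLM Resource and the LLM Prompt Node.

```python
# Minimal sketch (not part of Cognigy.AI): calling text-bison directly via the
# Vertex AI Python SDK. Project and location below are placeholder values.
import vertexai
from vertexai.language_models import TextGenerationModel

vertexai.init(project="your-gcp-project", location="us-central1")

model = TextGenerationModel.from_pretrained("text-bison@001")

# An instruction-style prompt, e.g. sentiment classification of a customer message
prompt = (
    "Classify the sentiment of the following customer message as positive, "
    "neutral, or negative.\n\n"
    "Message: The agent resolved my billing issue within minutes. Thank you!"
)

response = model.predict(prompt, temperature=0.2, max_output_tokens=256)
print(response.text)
```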

Improve Entity Extraction with Pattern-Based Question Types

Cognigy.AI v4.55 introduces nine new pattern-based Question Types to the Question Node. These pre-defined Types are designed to simplify entity mapping in a variety of customer service scenarios. From License Plate (DE) to IBAN, Bank Identifier Code, Social Security Number (US), IP Address (IPv4), Phone Number, and Credit Card, these Question Types ensure that essential information is captured accurately for further processing and parsing in enterprise backend systems.
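
To make the idea concrete, here is a small, simplified sketch of the kind of pattern matching such Question Types rely on, using example regular expressions for IBAN and IPv4 values. The patterns are illustrative assumptions, not Cognigy’s internal definitions.

```python
# Illustration only: simplified patterns of the kind used for pattern-based
# entity recognition. Not Cognigy's internal implementation.
import re

PATTERNS = {
    # IBAN: two-letter country code, two check digits, then 11-30 alphanumerics
    "iban": re.compile(r"^[A-Z]{2}\d{2}[A-Z0-9]{11,30}$"),
    # IPv4: four octets, each in the range 0-255
    "ipv4": re.compile(
        r"^((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)\.){3}(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)$"
    ),
}

def extract(kind: str, user_input: str) -> str | None:
    """Return the normalized entity if the user input matches the pattern, else None."""
    cleaned = user_input.replace(" ", "").upper() if kind == "iban" else user_input.strip()
    return cleaned if PATTERNS[kind].fullmatch(cleaned) else None

print(extract("iban", "DE89 3704 0044 0532 0130 00"))  # DE89370400440532013000
print(extract("ipv4", "192.168.0.12"))                 # 192.168.0.12
```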

Add Variables to Canned Responses for Personalization

Another noteworthy improvement in this release is the ability to add variables to Canned Responses in Cognigy Live Agent. With variables such as agent.name, contact.first_name, and contact.email, human agents can leverage Canned Responses for improved productivity without compromising on personalization. Simply type {{ in the Canned Response and select the desired variable from the list to deliver tailored responses faster.
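
As a conceptual sketch, the snippet below shows how {{...}} placeholders in a canned response can be resolved against agent and contact context. The variable names come from the examples above; the rendering logic itself is an assumption for illustration, not Live Agent’s actual implementation.

```python
# Conceptual sketch: resolving {{path.to.value}} placeholders in a canned
# response against a nested context dict. Not Live Agent's internal code.
import re

def render_canned_response(template: str, context: dict) -> str:
    """Replace {{path.to.value}} placeholders with values from the context."""
    def resolve(match: re.Match) -> str:
        value = context
        for key in match.group(1).split("."):
            value = value.get(key, "") if isinstance(value, dict) else ""
        return str(value)
    return re.sub(r"\{\{\s*([\w.]+)\s*\}\}", resolve, template)

context = {
    "agent": {"name": "Alex"},
    "contact": {"first_name": "Jamie", "email": "jamie@example.com"},
}
template = "Hi {{contact.first_name}}, this is {{agent.name}}. I'll follow up at {{contact.email}}."
print(render_canned_response(template, context))
# Hi Jamie, this is Alex. I'll follow up at jamie@example.com.
```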

Other Improvements for Cognigy.AI

Cognigy Virtual Agents

  • Improved by renaming the "GPT Prompt" Node to "LLM Prompt"
  • Improved by adding an "Accept Conversation Active Event" toggle to the 8x8 Endpoint settings
  • Improved by making the xApp submit payload collapsible
  • Improved by adding a default token for Generative AI output
  • Improved by adding MSAL authentication support in addition to the legacy ADAL token exchange between the bot and ABS

Cognigy Insights

  • Improved by changing the style and text of the collapsible data content in the Transcript and Message Explorers

Cognigy Live Agent

  • Improved by changing the time setting for auto-resolving conversations from hours to minutes

For further information, check out our complete Release Notes here.
