Valuable Benefits of Training Intelligent Virtual Assistants to Address FAQs


Published Date: August 13, 2022
Last Updated On: December 2, 2025

If you've ever spent time browsing the internet, chances are you've come across an 'FAQ' section, a vital component of a knowledge base. FAQ sections answer the questions customers ask most often, helping them at every step of their journey. As AI grows in popularity, organizations are increasingly leveraging intelligent virtual assistants to enhance their knowledge bases and deliver better customer service.

However, creating an AI-powered chatbot that can effectively handle FAQs is no small feat. It requires careful planning and training to ensure that the bot can anticipate and answer all of the questions a user may have.

Valuable Benefits of Training Intelligent Virtual Assistants to Address FAQs

Training AI chatbots on FAQs is a crucial element of providing efficient and effective customer service. By anticipating and answering frequently asked questions, intelligent virtual assistants (IVAs) help customers find the information they need quickly and easily, reducing frustration and improving the overall user experience.

By quickly and accurately answering common customer questions, IVAs also reduce the workload of live customer support agents and improve customer satisfaction. Automating the handling of frequently asked questions additionally enables 24/7 support, faster response times, and more efficient customer service operations.

You can do this by leveraging the Knowledge Graph Engine to build an FAQ repository, and below we walk through the main steps to build it out yourself.

Creating a Knowledge Graph

The Kore.ai XO Platform's Knowledge Graph (KG) helps you take static FAQ text and transform it into an intelligent, personalized conversation. It surpasses the conventional method of just presenting FAQs as straightforward question-answer pairs. The Knowledge Graph enables you to create a hierarchical structure of key terms and associate them with context-specific questions, including their alternative terms, synonyms, and more.

To generate a Knowledge Graph, you need to add FAQs to an existing or new IVA. If you're interested in comparing the two types, see the documentation detailing the Ontology Knowledge Graph and the Few-Shot Knowledge Graph.

You can find the Knowledge Graph in the Knowledge AI section under ‘Conversational Skills’ inside the bot builder. From there, you can get started.

Knowledge Graph Terminology

Understanding the terminology related to the Knowledge Graph can be tricky, but we've explained some of the terms below for your convenience. Can't find a term you're looking for? Check out the Knowledge Graph Terminology Page. 

1. Intent

The objective or goal a user has when engaging in a conversation with a chatbot or intelligent virtual assistant. It's essentially what the user intends to achieve from the conversation, such as booking a flight, making a purchase, or seeking information.

2. Ontology

A set of concepts and categories in a subject area or domain that shows their properties and the relations between them. Another way to look at an ontology: when common topics fall under a certain umbrella term, adding all of the related FAQs to that umbrella provides a higher degree of accuracy when matching incoming questions from users.

3. Terms or Nodes

The building blocks of an ontology; they are used to define the fundamental concepts and categories of a Knowledge Graph.

Root Term/Node: forms the topmost term of your Ontology. A Knowledge Graph contains only one root node, and all other nodes in the ontology become its child nodes. The Root node takes the name of the VA by default, but you can change it as needed.

4. First-level Term/Node

The immediate next-level nodes after the Root node. There can be any number of first-level nodes in a graph. We recommend using first-level nodes to represent high-level terms, such as the names of departments, functionalities, and so on. For example, in a Travel Assistant, you might have a first-level node called Reservation, which can be structured by functionality into sub-nodes such as Cancel and Update (see the sketch after this list).

5. Leaf Term/Node

Any node at the second level or below is called a Leaf Term/Node.
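
To make the hierarchy concrete, here is a minimal Python sketch of how the Travel Assistant ontology above could be modeled. The Node class and its fields are illustrative assumptions for this post, not part of the Kore.ai XO Platform or its APIs.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One term/node in the ontology; children form the hierarchy."""
    name: str
    synonyms: list[str] = field(default_factory=list)
    faqs: list[str] = field(default_factory=list)        # primary questions attached to this node
    children: list["Node"] = field(default_factory=list)

    def add_child(self, child: "Node") -> "Node":
        self.children.append(child)
        return child

# Root node: takes the assistant's name by default
root = Node("Travel Assistant")

# First-level node representing a high-level functionality
reservation = root.add_child(Node("Reservation", synonyms=["booking"]))

# Leaf nodes (second level and below)
cancel = reservation.add_child(Node("Cancel", faqs=["How do I cancel my reservation?"]))
update = reservation.add_child(Node("Update", faqs=["How do I change my travel dates?"]))
```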

Determine Intents

Before you start building your FAQs, it's important to identify which user intents are most beneficial by understanding the main goals your users are aiming to achieve in a customer service interaction. By identifying these goals, you can create FAQs that directly address these intents, enhancing user experience and satisfaction. This approach ensures that your FAQs are not just random questions but strategically designed tools that add value to your users' journey.

The new XO Platform Intent Discovery beta module helps you auto-extract popular intents from previous user conversations. It reduces the time and effort needed to build a virtual assistant and supports the success of your conversational AI journey. This beta feature currently supports English only and is available only to enterprise users.

You can upload your historical transcripts in CSV format. Once the transcripts are uploaded, the bot uses LLMs to identify the different topics, intents, and conversations between the user and the bot. You can review each intent to understand which conversations led to it, along with the underlying utterances that identified it. You can then either add these as new intents for your intelligent virtual assistant or pick specific utterances and train them as utterances for your existing dialogs and FAQs. It helps both ways: you can create new intents or enhance the training you provide to your virtual assistant.
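
As a rough illustration of the idea behind intent discovery (not the XO Platform's LLM-based implementation), the sketch below groups transcript utterances from a CSV into candidate intent clusters using TF-IDF and k-means. The transcripts.csv file name, the utterance column, and the cluster count are assumptions for the example.

```python
import csv
from collections import defaultdict

from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

# Assumed input: a CSV of historical transcripts with an 'utterance' column.
with open("transcripts.csv", newline="", encoding="utf-8") as f:
    utterances = [row["utterance"] for row in csv.DictReader(f) if row["utterance"].strip()]

# Vectorize the utterances and group them into candidate intents.
vectors = TfidfVectorizer(stop_words="english").fit_transform(utterances)
labels = KMeans(n_clusters=10, n_init=10, random_state=0).fit_predict(vectors)  # cluster count is a guess

candidate_intents = defaultdict(list)
for utterance, label in zip(utterances, labels):
    candidate_intents[label].append(utterance)

# Review each cluster, then either promote it to a new intent or use its
# utterances as extra training for an existing dialog or FAQ.
for label, examples in candidate_intents.items():
    print(f"Candidate intent {label}: {examples[:3]}")
```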

Creating a Node

If you are starting from scratch, creating the Knowledge Graph node structure is the first step. By default, the name of the IVA becomes the root node of the hierarchy, but you can edit this. Create the rest of the nodes below the root node.

To create nodes, follow these steps:

  1. Open the Knowledge Graph.
  2. On the top left of the Knowledge Graph window, hover over the root node.
  3. Click the + icon. A text box appears below to Add Node.
  4. Type the name of the node in the text box and press Enter.
  5. Repeat steps 2 to 4 to create other First-level nodes.
  6. After you create First-level nodes, create child nodes as follows:
    • Hover over any First-level node, and click the plus icon to create its child node.
    • You can create a child node for any level node by hovering over it and clicking the + icon.

Follow the same process to create multiple node levels. 

The next step is to add Knowledge Graph Intents, which can be either:

  • FAQ – to answer user queries
  • Task – to execute a dialog task.

*Note: For better performance, the Knowledge Graph is limited to 50k FAQs spread across a maximum of 20k nodes.
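
To picture the two intent types and these size limits together, here is a small standalone sketch; the KGIntent and Node structures and the count helper are illustrative assumptions, not the platform's data model.

```python
from dataclasses import dataclass, field
from typing import Literal, Optional

@dataclass
class KGIntent:
    """A Knowledge Graph intent: either answers directly (FAQ) or runs a dialog task."""
    kind: Literal["faq", "task"]
    question: str                        # primary question / trigger utterance
    answer: Optional[str] = None         # used when kind == "faq"
    dialog_task: Optional[str] = None    # used when kind == "task"

@dataclass
class Node:
    name: str
    intents: list[KGIntent] = field(default_factory=list)
    children: list["Node"] = field(default_factory=list)

MAX_FAQS, MAX_NODES = 50_000, 20_000     # limits from the note above

def count(node: Node) -> tuple[int, int]:
    """Return (total FAQ intents, total nodes) in the subtree rooted at `node`."""
    faqs = sum(1 for i in node.intents if i.kind == "faq")
    nodes = 1
    for child in node.children:
        child_faqs, child_nodes = count(child)
        faqs, nodes = faqs + child_faqs, nodes + child_nodes
    return faqs, nodes

root = Node("Travel Assistant", intents=[
    KGIntent("faq", "What is your cancellation policy?",
             answer="You can cancel up to 24 hours before departure."),
    KGIntent("task", "Cancel my reservation", dialog_task="Cancel Reservation"),
])
faq_count, node_count = count(root)
assert faq_count <= MAX_FAQS and node_count <= MAX_NODES
```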

Adding FAQs

To add an FAQ, follow these steps:

1. On the left pane of the Knowledge Graph window, click the node to which you want to add questions.

2. Click Add Intent on the top-right.

3. On the Intent window, under the Intent section, select FAQ.

4. Optionally, enter a Display Name. This name will be used for presenting the FAQ to the end-users.

5. In the Add Question field, enter the question that describes the user’s query.

6. Optionally, if there are alternatives to the same question, add them in the Add Alternate FAQ field. Repeat the step for all the alternative questions you want to add.

7. You can use patterns to define the FAQs. This can be done by preceding the pattern with || (two vertical bars) in the alternate question field. The Platform marks these as patterns and evaluates them accordingly (see here for more on patterns).

8. For each question, you can add terms that serve as tags and help the Knowledge Graph Engine better identify the question.

9. You can set the Intent Status to enabled or disabled for FAQ intents. The Knowledge Graph only uses FAQ intents that are in the enabled state; intents marked as disabled do not participate in the intent recognition process during testing and end-user interaction.

10. You can also set the Term Status to enabled or disabled. The Knowledge Graph only uses the terms that are in the enabled state. Terms marked as disabled, along with all their FAQ intents, do not participate in the intent recognition process during testing and end-user interaction.

11. You can also add a Reference Id. This field can be used to add a reference to any external content used as a source for this FAQ.

12. As you enter these questions, pay attention to terms that you can further add to your FAQ hierarchy. (A sketch of what a complete FAQ entry might carry follows these steps.)
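
To visualize what a single FAQ entry carries after these steps, here is an illustrative record. The FAQEntry field names are assumptions made for this sketch, not the platform's schema; alternate questions that begin with || are treated as patterns, mirroring step 7.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class FAQEntry:
    display_name: str
    primary_question: str
    alternate_questions: list[str] = field(default_factory=list)  # entries starting with || are patterns
    terms: list[str] = field(default_factory=list)                # tags that help the KG engine locate the question
    intent_status: str = "enabled"      # only enabled intents take part in intent recognition
    reference_id: Optional[str] = None  # pointer to the external source of this answer

faq = FAQEntry(
    display_name="Baggage allowance",
    primary_question="How much baggage can I carry?",
    alternate_questions=[
        "What is the baggage limit?",
        "|| baggage * allowance",       # pattern-style alternate, per step 7
    ],
    terms=["baggage", "allowance"],
    reference_id="KB-1042",
)

# Separate plain alternates from patterns the way the platform distinguishes them.
patterns = [q for q in faq.alternate_questions if q.startswith("||")]
plain = [q for q in faq.alternate_questions if not q.startswith("||")]
```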

Manage Bot Responses

For the intelligent virtual assistant response, you can compose simple or complex channel-specific replies. 

  • Standard: The prompt defined when adding a node in Dialog Builder is the standard or default prompt. When multiple standard prompts are defined for a node, the Platform chooses a random one to display to the end-user.
  • Channel-Specific: Prompts can be defined for specific channels such as email, SMS, Twitter, and more. By varying the responses, you can tailor the language and formatting to the strengths of each channel. To add a channel-specific response, select the channel from the channels list before typing the response.

You can use either the basic or advanced editor to edit responses; refer to the guide here for more on user prompts.

*Note: We recommend you add one response for All Channels so that it can be used in the absence of a channel-specific response.

Sometimes responses to an FAQ are quite lengthy or include nice-to-have information along with the primary answer. To improve the readability of such responses, you can split the information into multiple responses that are sent as separate messages, one after another, by clicking Add Extended Response on the top-right of the Bot Response window.

Optionally, you can use Add Alternate Response if your question can have more than one answer. Repeat the step for all the alternative responses you want to add. At runtime, the platform will pick one response at random.
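
The selection behavior described above can be pictured with a short sketch: prefer a channel-specific response, fall back to the All Channels response, pick one alternate at random, and send extended responses as separate follow-up messages. This is an illustration of the behavior, not platform code, and the example responses and channel names are invented.

```python
import random

# Illustrative response store: per-channel lists of alternate responses, plus
# optional extended responses that go out as separate follow-up messages.
responses = {
    "all": ["You can cancel online or through our app.",
            "Cancellations can be made online or in the app."],
    "sms": ["Reply CANCEL to cancel your booking."],
}
extended = ["Refunds are processed within 5-7 business days."]

def pick_responses(channel: str) -> list[str]:
    """Prefer a channel-specific prompt, fall back to the All Channels prompt,
    choose one alternate at random, then append extended responses as extra messages."""
    pool = responses.get(channel) or responses["all"]
    return [random.choice(pool), *extended]

print(pick_responses("sms"))    # channel-specific response
print(pick_responses("email"))  # falls back to the All Channels response
```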

You can learn more about managing FAQs here. 

Execute Dialog Tasks

You can link a dialog task to a Knowledge Graph Intent. This lets you combine the capabilities of the Knowledge Graph and dialog tasks to handle FAQs that involve complex conversations.

  1. On the Intent window, under the Intent section, select Task.
  2. Optionally, enter a Display Name. This name will be used for presenting the FAQ to the end-users in case of ambiguity.
  3. Select a task from the drop-down list. You can Add Utterance that triggers this task.
  4. If multiple utterances mean the same, Add Alternate Utterance.
  5. You can also add a Reference Id. This field can be used to add a reference to any external content used as a source for this FAQ.
  6. Click Save.

Improving Knowledge Graph Performance

The Knowledge Graph engine works well with the default settings, but you can fine-tune its performance.

Here are a few guidelines:

  1. Configure the Knowledge Graph by defining terms, synonyms, primary and alternative questions, or user utterances. Though hierarchy does not affect the KG engine performance, it does help organize and guide your knowledge implementation.

  2. Set the following parameters (a simplified scoring sketch follows this list):
    1. Path Coverage – For Ontology-based graphs, define the minimum percentage of terms in the user's utterance that must be present in a path to qualify it for further scoring.
    2. Definite Score for KG – Define the minimum score for a KG intent match to be considered a definite match, discarding any other intent matches found.
    3. Minimum and Definitive Level for Knowledge Tasks – Define the minimum and definitive thresholds to identify and respond to a knowledge task.
    4. KG Suggestions Count – Define the maximum number of KG/FAQ suggestions to present when a definite KG intent match is unavailable.
    5. Proximity of Suggested Matches – Define the maximum difference allowed between the top-scoring and the immediately next suggested questions for them to be considered equally important.
    6. Qualify Contextual Paths – Ensures that the bot context is populated and retained with the terms/nodes of the matched intent, further enhancing the user experience.

*Note: You can customize these settings in Natural Language > Thresholds & Configurations. 

  3. Traits – Traits qualify nodes/terms even if the user utterance does not contain the term/node. Traits also help filter the suggested intent list.
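
To make the thresholds more tangible, the sketch below scores candidate ontology paths by the share of utterance terms they cover, applies a path-coverage cut-off, and short-circuits on a definite match, loosely mirroring the Path Coverage and Definite Score parameters above. The scoring formula and the numbers are simplified assumptions, not the KG engine's actual algorithm.

```python
def qualify_paths(utterance_terms: set[str], paths: dict[str, set[str]],
                  path_coverage: float = 0.5, definite_score: float = 0.8) -> list[str]:
    """Score each path by the share of utterance terms it contains, keep paths above
    the coverage threshold, and short-circuit on a definite match."""
    scored = []
    for name, path_terms in paths.items():
        coverage = len(utterance_terms & path_terms) / len(utterance_terms)
        if coverage >= path_coverage:
            scored.append((coverage, name))
    scored.sort(reverse=True)
    if scored and scored[0][0] >= definite_score:
        return [scored[0][1]]                 # definite match: discard the rest
    return [name for _, name in scored]       # otherwise, return suggestions to disambiguate

paths = {
    "Reservation > Cancel": {"reservation", "cancel", "booking"},
    "Reservation > Update": {"reservation", "update", "change"},
    "Baggage": {"baggage", "allowance"},
}
print(qualify_paths({"cancel", "reservation"}, paths))  # ['Reservation > Cancel']
print(qualify_paths({"reservation", "fees"}, paths))    # suggestions from both Reservation paths
```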

Tips for Building a Knowledge Graph 

Follow these guidelines to build and train your Knowledge Graph:

  • Identify terms by grouping the unique words in each FAQ question, and build a hierarchy based on those unique words.
  • Ensure that each node has no more than 25 questions.
  • Associate traits with terms to enable filtering FAQs from multiple identified results.
  • Define synonyms for each term/node in the hierarchy. Ensure that all the different ways to refer to the term are defined.
  • Depending on the importance of each term in a path, mark it as either mandatory or regular.
  • Define alternative questions for each FAQ to ensure better coverage.
  • Manage context for accurate responses.
  • Use Stop Words to filter out unwanted words from user utterances (a small matching sketch follows this list).
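
Several of these tips, synonyms, stop words, and mandatory terms in particular, can be pictured with a small matching sketch. The normalization and matching logic here is an illustrative assumption, not the KG engine's implementation, and the word lists are invented.

```python
STOP_WORDS = {"the", "a", "an", "is", "my", "to", "how", "do", "i"}
SYNONYMS = {"booking": "reservation", "trip": "reservation", "drop": "cancel"}

def normalize(utterance: str) -> set[str]:
    """Lowercase, drop stop words, and map synonyms onto their canonical terms."""
    words = utterance.lower().replace("?", "").split()
    return {SYNONYMS.get(w, w) for w in words if w not in STOP_WORDS}

def path_matches(terms: set[str], path: list[tuple[str, bool]]) -> bool:
    """A path matches only if every mandatory term is present; regular terms are optional.
    Each path entry is (term, is_mandatory)."""
    return all(term in terms for term, mandatory in path if mandatory)

terms = normalize("How do I drop my booking?")                          # -> {"cancel", "reservation"}
print(path_matches(terms, [("reservation", True), ("cancel", True)]))   # True
print(path_matches(terms, [("baggage", True), ("allowance", False)]))   # False
```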

To continue improving your Knowledge Graph engine, see our step-by-step process for Knowledge Graph Training for more details on fine-tuning your intelligent virtual assistant.
