No-code deployment and orchestration of open-sourced foundation models

Published: July 26, 2024
Last updated: November 19, 2025

No-code deployment of open-sourced foundation models from Argilla, EleutherAI, Facebook, Google, HuggingFace, Mistral AI, Meta T5, Tiiuae, and more...

Introduction

There is a whole host of open-sourced language models available, both Large Language Models (LLMs) and Small Language Models (SLMs). Being able to host these models and expose them via an API, or to use them from process automation flows, is a significant enabler for building Generative AI applications.

The GALE AI productivity suite is a good example of an environment in which models can be explored, managed, deployed and used without writing code.

Advantages of open-sourced models

In principle, open-sourced models should be more accessible than commercial models, and by leveraging in-context learning (ICL), less capable models can be as efficient as commercial ones, if not more so. Open-source LLMs are freely available to researchers, developers and organisations, but making practical use of them still demands hosting the model and exposing it via a managed API. GALE handles both, and also supports customisation and fine-tuning: users have the freedom to modify open-source LLMs according to their specific needs and preferences, which allows for tailored solutions that address unique use cases.
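As a concrete illustration of ICL, a few labelled examples can be packed into the prompt so a smaller model follows the pattern without any fine-tuning. A minimal sketch; the sentiment task and labels here are made up for illustration:

```python
# Sketch: in-context learning (ICL) — prepend labelled examples to the
# prompt so a smaller model can imitate the pattern without fine-tuning.
def few_shot_prompt(examples, query, instruction="Classify the sentiment."):
    lines = [instruction]
    for text, label in examples:
        lines.append(f"Text: {text}\nSentiment: {label}")
    # The trailing "Sentiment:" cue is where the model completes the label.
    lines.append(f"Text: {query}\nSentiment:")
    return "\n\n".join(lines)
```

The resulting string is sent as the prompt to whichever hosted model you deploy; the examples effectively stand in for training data.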

I would argue that transparency also means guarding against model drift, a real threat to production implementations: the scenario where the underlying model changes over time without any notice to the user. Alongside model drift there is catastrophic forgetting, which has been documented in relation to OpenAI models. Hosting a private open-source model instance guards against both problems. Open-source LLMs also facilitate rapid innovation by allowing users to leverage existing models, datasets, and tools as building blocks for new applications and research projects, accelerating the pace of development in natural language processing.
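One practical guard against silent drift, sketched below under the assumption that you keep a fixed probe set with stored reference outputs, is to replay those probes periodically and flag any answers that changed. `call_model` is a stand-in for whatever client function you use:

```python
# Sketch: detect model drift by replaying fixed probe prompts and comparing
# outputs against stored reference answers. Any mismatch is reported so a
# human (or alert) can decide whether the deployment changed underneath you.
def drift_report(call_model, probes, references):
    changed = {}
    for prompt in probes:
        out = call_model(prompt)
        if out != references[prompt]:
            changed[prompt] = out
    return changed  # empty dict means no drift detected on this probe set
```

With a privately hosted open-source model the report should stay empty unless you redeploy a new version yourself.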

Model orchestration

Using the Right Model for the Right Task: Task-Specific Models

Different tasks in NLP, such as text classification, language translation, and sentiment analysis, may require specialised models optimised for their respective tasks. By selecting the appropriate model for each task, organisations can achieve better performance and accuracy. Factors such as model architecture, pre-training data, fine-tuning techniques, and computational resources must be considered when choosing.

Using and orchestrating multiple LLMs can improve performance by leveraging the strengths of different models at various stages of the application flow. Model orchestration methods combine predictions from multiple models to produce a more robust and accurate output, particularly in tasks where individual models have limitations or biases.

Orchestration frameworks can dynamically select the most suitable model for each task based on factors such as task requirements, model performance, and resource availability. This adaptive approach ensures that the system can handle a wide range of tasks and adapt to changing conditions over time. Orchestration can easily be achieved with a flow builder, as seen below, where different models can be invoked based on different scenarios and conditions.
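The dynamic-selection idea reduces to a routing table in its simplest form. A sketch; the task names and model identifiers below are illustrative, not GALE identifiers:

```python
# Sketch: route each task type to a task-specific model, with a general
# fallback model for anything unrecognised. Names are illustrative only.
ROUTES = {
    "translation": "nllb-200",
    "summarisation": "t5-small",
    "sentiment": "distilbert-sst2",
}

def pick_model(task: str, default: str = "mistral-7b") -> str:
    return ROUTES.get(task, default)
```

A real orchestrator would also weigh latency, cost and observed accuracy per model, but the control flow is the same: classify the task, then dispatch.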

[Figure: Agent flows shown in Kore.ai's GALE productivity suite]

In summary, large language model orchestration involves managing the deployment, scaling, and optimisation of LLMs, while using the right model for the right task involves selecting appropriate models based on task requirements and optimising performance through dynamic model selection and ensemble learning techniques.
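A minimal ensemble along these lines is majority voting over the labels returned by several models, falling back to the first (most trusted) model's answer on a tie. A sketch under that assumption:

```python
from collections import Counter

# Sketch: combine labels from several models by majority vote. On a tie,
# fall back to the first label in the list (the most trusted model).
def majority_vote(labels):
    counts = Counter(labels)
    top, n = counts.most_common(1)[0]
    if sum(1 for c in counts.values() if c == n) > 1:
        return labels[0]  # tie: defer to the first model's answer
    return top
```

This is the simplest form of ensembling; weighted voting or confidence-based selection follow the same shape.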

Accelerated Generative AI adoption

Large language models (LLMs) are accessible through the three options depicted in the image below. Various LLM-based User Interfaces, such as HuggingChat, Cohere Coral, and ChatGPT, offer conversational interfaces where the UI can learn user preferences in certain cases. Cohere Coral facilitates document and data uploads to serve as contextual references. LLM APIs represent the most popular method for organisations and enterprises to leverage LLMs. The market offers numerous commercial offerings, as detailed in the accompanying image. While LLM APIs are the simplest means to develop Generative Apps, they present challenges including cost, data privacy, inference latency, rate limits, catastrophic forgetting, model drift, and more. Several open-source raw models are available for free use, although implementing and operating them requires specialised knowledge. Additionally, hosting costs will increase with adoption.
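Rate limits in particular are usually handled client-side with exponential backoff. A sketch, where `RateLimitError` stands in for whatever exception your client library actually raises:

```python
import time

# Stand-in for the rate-limit exception raised by a real LLM client library.
class RateLimitError(Exception):
    pass

# Sketch: retry a rate-limited API call with exponential backoff
# (1s, 2s, 4s, ... by default), re-raising after the final attempt.
def call_with_backoff(fn, retries=4, base_delay=1.0):
    for attempt in range(retries):
        try:
            return fn()
        except RateLimitError:
            if attempt == retries - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)
```

Privately hosted models sidestep the rate-limit problem entirely, which is part of the argument for the third option.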

[Figure: Types of LLMs]

Local, private hosting of fit-for-purpose SLMs solves most of these challenges.

Model access via GALE

Below is a matrix of the open-sourced models available via the GALE productivity suite; the number of available models is sure to grow.

[Figure: Open-sourced models in Kore.ai's GALE productivity suite]

Considering the image below of GALE, three options are available: fine-tuned models, open-source models, or adding external models. In this example, the open-source models option is selected. A list of available models within GALE is displayed, which can also be searched. I selected the t5-small model, which deploys within a few minutes; GALE notified me via email the moment the model was deployed. Models can be managed in terms of status: deployed, un-deployed, and deleted.

Once deployed, as seen below, a few options become available: a model endpoint is immediately exposed, with various parameters that can be set. I could copy the Python code example directly from the GALE user interface and paste it into a notebook.
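That copy-paste snippet isn't reproduced in this post, but a client call to a deployed endpoint typically looks like the sketch below. The URL, header names, and payload fields are hypothetical stand-ins for the values shown in the GALE UI, not a documented GALE schema:

```python
import json
import urllib.request

# Hypothetical endpoint and key — substitute the values from your deployment.
ENDPOINT = "https://your-gale-host.example.com/models/t5-small/generate"
API_KEY = "YOUR_API_KEY"

def generate(prompt: str, max_new_tokens: int = 64,
             temperature: float = 0.7) -> urllib.request.Request:
    """Build the POST request; send it with urllib.request.urlopen(req)."""
    body = json.dumps({
        "prompt": prompt,
        "max_new_tokens": max_new_tokens,
        "temperature": temperature,
    }).encode("utf-8")
    return urllib.request.Request(
        ENDPOINT,
        data=body,
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {API_KEY}"},
        method="POST",
    )

# Usage (against a live deployment):
# req = generate("translate English to German: Hello")
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```

The request-building half runs anywhere; the commented-out send assumes a reachable deployment.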

The deployed model is also accessible via the playground for experimentation. Multiple models can be compared in parallel, using the same input and prompt.

Below, the models available to me are shown via the dropdown…

Conclusion

The importance of an AI productivity suite lies in its ability to empower individuals and organisations to achieve their goals faster, smarter, and with greater precision. By harnessing the power of AI to automate tasks, personalise experiences, and drive insights, a productivity suite becomes an indispensable asset in creating, deploying and managing generative AI applications.

Author
Cobus Greyling, Chief Evangelist