Enhancing OrchardCore AI with the Model Context Protocol (MCP)

Artificial Intelligence has come a long way — from simple autocomplete suggestions to powerful, conversational agents. But here’s the catch: even the smartest AI model can’t read your mind.
Context is everything.
Give an LLM a vague prompt, and you’ll get a vague response. But give it rich, structured, real-time context — and it becomes capable of surprising intelligence.
That’s the promise behind the Model Context Protocol (MCP) — and now, thanks to the CrestApps MCP Module, this power is available directly inside your OrchardCore applications.
Whether you're building a chatbot, a content assistant, or a smart dashboard, this new module makes it easy to extend AI behavior by plugging in external tools, services, and even your own backend logic — all without reinventing the wheel.
Let’s explore how MCP is changing the game for AI integrations and how you can get started today.
🤖 What Is the Model Context Protocol (MCP) and Why It’s a Game Changer
At its core, MCP is a simple but powerful idea: what if there was a universal way for AI models to talk to external systems?
Think of it like the USB-C for AI. With USB-C, you can connect your laptop to power, displays, or storage — no matter the brand. MCP works the same way for AI: it defines a standard protocol that lets large language models connect to tools, data, and services, no matter where they live.
So, what does MCP actually enable?
- 🧩 Pluggable Tools: Attach prebuilt tools like `mcp/time`, `mcp/calc`, or `mcp/shell` that run locally or in containers.
- 🌐 Real-Time Context: Provide your AI with live data from internal APIs, databases, or system commands.
- 🧠 Cross-Model Compatibility: Use the same tools across OpenAI, Ollama, Claude, and more.
- 🔒 Privacy & Control: Run tools locally or behind your firewall — no need to send data to the cloud.
It’s not just about better responses — it’s about making AI feel integrated into your app’s ecosystem, aware of your user, your data, and your environment.
Learn more: modelcontextprotocol.io/introduction
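Under the hood, MCP messages are plain JSON-RPC 2.0. Here is a simplified, illustrative sketch (field values are examples, not the output of any particular server) of a client asking a server which tools it exposes:
```
// Illustrative only: simplified from the MCP specification.
// Client -> server: list the tools you offer.
{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }

// Server -> client: a time tool the model can call.
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      {
        "name": "get_current_time",
        "description": "Returns the current time for an IANA time zone",
        "inputSchema": {
          "type": "object",
          "properties": { "timezone": { "type": "string" } },
          "required": ["timezone"]
        }
      }
    ]
  }
}
```
Because every server describes its tools in this same shape, a chat client only needs to implement the protocol once to work with all of them.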
🧩 Extending OrchardCore AI with MCP
With the CrestApps AI Chat module, you already have a great foundation for building AI-powered experiences in OrchardCore. You can create profiles, hook into your preferred LLM provider, and start chatting with your model right away.
But let’s imagine a common scenario.
🧪 Real-World Example: Why Static AI Isn’t Enough
Let’s say you’ve created an AI profile that connects to your favorite LLM provider — great! Now you type something like:
“What is the current time in Bethlehem, Palestine?”
The model, however, responds with:
“I’m not sure — I don’t have access to real-time data.”
That’s because most language models — even the smartest ones — aren’t connected to the real world. They’re like extremely well-read introverts: lots of knowledge, no access to what’s happening right now.
This is where MCP comes to the rescue.
By connecting your OrchardCore AI profile to a remote or local MCP server — such as one that runs the `mcp/time` tool — your AI gains the ability to fetch real-time timezone information. The server does the lookup, returns the current time for Palestine, and the model weaves it into its answer naturally.
It feels like magic — but it’s just structured context delivered through MCP.
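In protocol terms, the lookup is just a `tools/call` request. The tool and argument names below follow the reference time server image, so treat them as illustrative rather than something the module guarantees:
```
// Illustrative request: ask the time tool for the current time
// in the IANA zone covering Bethlehem (Asia/Hebron).
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "get_current_time",
    "arguments": { "timezone": "Asia/Hebron" }
  }
}
```
The server replies with the current time as text content, and the chat module hands that back to the model so it can phrase the final answer.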
⚙️ Getting Started with the CrestApps MCP Module
The `CrestApps.OrchardCore.AI.Mcp` module extends the AI Chat capabilities by allowing your OrchardCore site to interact with MCP servers via:
- 🌐 Remote MCP Servers using Server-Sent Events (SSE)
- 🖥 Local MCP Tools via Standard Input/Output (Stdio)
To get started:
- Install the `CrestApps.OrchardCore.AI.Mcp` NuGet package in your web project (see the command after this list).
- In the OrchardCore dashboard, go to Tools → Features.
- Enable one or both of:
  - Model Context Protocol (MCP) Client
  - Model Context Protocol (Local MCP) Client
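If you prefer the command line for the first step, the package can be added with the .NET CLI (run from your web project's directory):
```
dotnet add package CrestApps.OrchardCore.AI.Mcp
```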
🌐 Connecting to a Remote MCP Server (SSE Transport)
You can integrate any external MCP Server using Server-Sent Events (SSE), a lightweight, real-time communication method.
🛠 Step-by-Step:
- Go to Artificial Intelligence → MCP Connections
- Click Add Connection
- Under SSE Source, click Add
- Fill in details like:
  - Display Text: Remote AI Time Server
  - Endpoint: https://localhost:1234/
  - Additional Headers: (optional)
- Click Save
🎯 Then go to AI Profiles and create a new profile using this connection.
🧾 Alternative: Recipe-Based Setup for Remote Server
You can also define the connection via a recipe file:
```
{
  "steps": [
    {
      "name": "McpConnection",
      "connections": [
        {
          "DisplayText": "Remote AI Tools",
          "Properties": {
            "SseMcpConnectionMetadata": {
              "Endpoint": "https://localhost:1234/",
              "AdditionalHeaders": {}
            }
          }
        }
      ]
    }
  ]
}
```
This is handy for multi-tenant solutions, automation, or CI/CD pipelines.
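If you drive setup entirely from recipes, you can pair the connection with OrchardCore's standard feature step so the MCP client is enabled in the same run. The feature ID below is an assumption based on the package name, so confirm the exact ID on the Features page:
```
{
  "steps": [
    {
      "name": "feature",
      // Assumed feature ID (based on the package name); confirm under Tools → Features.
      "enable": [
        "CrestApps.OrchardCore.AI.Mcp"
      ]
    }
  ]
}
```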
🖥 Using the Local MCP Client (Stdio Transport)
Want your AI to work offline or integrate with local tools? Use the Local MCP Client, which runs tools via standard input/output — especially useful with Dockerized tools like:
- 🕒 `mcp/time` (returns current time with time zone awareness)
- ➕ `mcp/calc` (evaluates math)
- 🖥 `mcp/shell` (secure command runner)
🧭 Example: Use `mcp/time` via Docker
Step 1: Install Docker Desktop
Download and install Docker Desktop if you haven't already.
Step 2: Pull the Image
```
docker pull mcp/time
```
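Optionally, you can smoke-test the image before wiring it into OrchardCore by piping a minimal MCP initialize request into the container over stdin. This is a rough sketch; the exact handshake is defined by the MCP spec, and the client name and version here are placeholders:
```
# Rough smoke test: the server should answer on stdout with its capabilities.
echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.1"}}}' \
  | docker run -i --rm mcp/time
```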
Step 3: Add the Connection in OrchardCore
- Go to AI → MCP Connections
- Click Add Connection
- Choose Stdio Source and click Add
- Enter:
  - Display Text: Global Time Capabilities
  - Command: docker
  - Arguments: ["run", "-i", "--rm", "mcp/time"]
Click Save, and you’re ready to create a profile that uses this connection.
🧾 Recipe-Based Setup for Local Tools
Prefer automation? Here's how to configure the same connection via a recipe:
```
{
  "steps": [
    {
      "name": "McpConnection",
      "connections": [
        {
          "DisplayText": "Global Time Capabilities",
          "Properties": {
            "StdioMcpConnectionMetadata": {
              "Command": "docker",
              "Arguments": [
                "run",
                "-i",
                "--rm",
                "mcp/time"
              ]
            }
          }
        }
      ]
    }
  ]
}
```
🧠 More Creative Use Cases
Once MCP is connected, you can get creative with your AI tooling. Here are a few ideas:
- 🧮 Natural Language Calculator with `mcp/calc`: evaluate math queries like “What’s 25% of 480?”
- 📂 File Search Assistant using `mcp/shell`: help users locate files on disk using AI-generated search commands.
- 🕵️ Internal Monitoring Bot: connect to internal logs, status scripts, or diagnostic tools.
All of these can be exposed to your AI model using MCP’s consistent, secure protocol.
🌎 Where to Find MCP-Compatible Tools
MCP is growing fast, and new tools and servers appear regularly; the official MCP site linked above is a good place to start exploring what's available.
✅ Wrapping Up
The Model Context Protocol is redefining how AI applications get their context — and with the CrestApps MCP Module, that power is now available to OrchardCore developers with minimal setup and maximum flexibility.
Whether you're looking to add intelligence to your CMS, build AI-powered assistants, or just give your model access to real-world data — MCP offers a clean, scalable, and secure path forward.
Want to try it? 👉 Get started with the CrestApps MCP Module