Commit 016d809

Merge pull request #532 from MicrosoftDocs/main
added MCP docs
2 parents ec25939 + 41544e9 commit 016d809

File tree

5 files changed: +316 −37 lines


semantic-kernel/concepts/ai-services/chat-completion/function-calling/index.md

Lines changed: 32 additions & 0 deletions

@@ -214,6 +214,7 @@ When using auto function calling in KernelFunctions, certain parameter names are
### Reserved Names

The following parameter names are reserved:

- `kernel`
- `service`
- `execution_settings`
@@ -242,6 +243,37 @@ class SimplePlugin:
        return f"Received user input: {location}, the weather is nice!"
```

## Custom Reserved Parameter Names for Auto Function Calling

You can also customize this behavior: annotate the parameter you want to exclude from the function-calling definition, like this:

```python
from typing import Annotated

from semantic_kernel.functions import kernel_function

class SimplePlugin:
    @kernel_function(name="GetWeather", description="Get the weather for a location.")
    async def get_the_weather(self, location: str, special_arg: Annotated[str, {"include_in_function_choices": False}]) -> str:
        # The 'special_arg' parameter is reserved, so make sure it either has a default value or gets passed in.
        ...
```

When calling this function, make sure to pass the `special_arg` parameter; otherwise an error is raised.

```python
response = await kernel.invoke_async(
    plugin_name=...,
    function_name="GetWeather",
    location="Seattle",
    special_arg="This is a special argument",
)
```

Or add it to the `KernelArguments` object to use it with auto function calling in an agent, like this:

```python
arguments = KernelArguments(special_arg="This is a special argument")
response = await agent.get_response(
    messages="what's the weather in Seattle?",
    arguments=arguments,
)
```
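The `include_in_function_choices` flag above is plain `Annotated` typing metadata. As a rough, hypothetical sketch of the mechanism (standard library only, not Semantic Kernel's actual implementation, and `exposed_parameters` is an invented helper), a framework can read that metadata to decide which parameters appear in the function-choices schema:

```python
from typing import Annotated, get_args, get_type_hints

def get_the_weather(
    location: str,
    special_arg: Annotated[str, {"include_in_function_choices": False}],
) -> str:
    return f"The weather in {location} is nice! ({special_arg})"

def exposed_parameters(func) -> list[str]:
    """Return the parameter names that should be advertised to the model."""
    hints = get_type_hints(func, include_extras=True)
    hints.pop("return", None)
    exposed = []
    for name, hint in hints.items():
        # Annotated[T, meta, ...] -> get_args gives (T, meta, ...); a plain T gives ().
        metadata = get_args(hint)[1:]
        hidden = any(
            isinstance(m, dict) and m.get("include_in_function_choices") is False
            for m in metadata
        )
        if not hidden:
            exposed.append(name)
    return exposed

print(exposed_parameters(get_the_weather))  # ['location']
```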
::: zone-end

::: zone pivot="programming-language-java"

semantic-kernel/concepts/kernel.md

Lines changed: 123 additions & 6 deletions

@@ -14,11 +14,13 @@ ms.service: semantic-kernel
The kernel is the central component of Semantic Kernel. At its simplest, the kernel is a Dependency Injection container that manages all of the services and plugins necessary to run your AI application. If you provide all of your services and plugins to the kernel, they will then be seamlessly used by the AI as needed.

## The kernel is at the center

Because the kernel has all of the services and plugins necessary to run both native code and AI services, it is used by nearly every component within the Semantic Kernel SDK to power your agents. This means that if you run any prompt or code in Semantic Kernel, the kernel will always be available to retrieve the necessary services and plugins.

![The kernel is at the center of everything in Semantic Kernel](../media/the-kernel-is-at-the-center-of-everything.png)

This is extremely powerful, because it means you as a developer have a single place where you can configure, and most importantly monitor, your AI agents. Take, for example, invoking a prompt from the kernel. When you do so, the kernel will:

1. Select the best AI service to run the prompt.
2. Build the prompt using the provided prompt template.
3. Send the prompt to the AI service.
@@ -28,12 +30,13 @@ This is extremely powerful, because it means you as a developer have a single pl
Throughout this entire process, you can create events and middleware that are triggered at each of these steps. This means you can perform actions like logging, providing status updates to users, and, most importantly, applying responsible AI practices, all from a single place.

## Build a kernel with services and plugins

Before building a kernel, you should first understand the two types of components that exist:

| Component | Description |
|---|---|
| **Services** | These consist of both AI services (e.g., chat completion) and other services (e.g., logging and HTTP clients) that are necessary to run your application. This was modelled after the Service Provider pattern in .NET so that we could support dependency injection across all languages. |
| **Plugins** | These are the components that are used by your AI services and prompt templates to perform work. AI services, for example, can use plugins to retrieve data from a database or call an external API to perform actions. |

::: zone pivot="programming-language-csharp"
To start creating a kernel, import the necessary packages at the top of your file:
@@ -49,7 +52,7 @@ builder.AddAzureOpenAIChatCompletion(modelId, endpoint, apiKey);
builder.Services.AddLogging(c => c.AddDebug().SetMinimumLevel(LogLevel.Trace));
builder.Plugins.AddFromType<TimePlugin>();
Kernel kernel = builder.Build();
```

::: zone-end

@@ -82,6 +85,119 @@ kernel.add_plugin(
)
```

## Creating an MCP Server from your Kernel

We now support creating an MCP server from the functions you have registered in your Semantic Kernel instance.

To do this, create your kernel as you normally would, and then create an MCP server from it:

```python
from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion
from semantic_kernel.functions import kernel_function
from semantic_kernel.prompt_template import InputVariable, PromptTemplateConfig

kernel = Kernel()

@kernel_function()
def echo_function(message: str, extra: str = "") -> str:
    """Echo a message as a function"""
    return f"Function echo: {message} {extra}"

kernel.add_service(OpenAIChatCompletion(service_id="default"))
kernel.add_function("echo", echo_function, "echo_function")
kernel.add_function(
    plugin_name="prompt",
    function_name="prompt",
    prompt_template_config=PromptTemplateConfig(
        name="prompt",
        description="This is a prompt",
        template="Please repeat this: {{$message}} and this: {{$extra}}",
        input_variables=[
            InputVariable(
                name="message",
                description="This is the message.",
                is_required=True,
                json_schema='{ "type": "string", "description": "This is the message."}',
            ),
            InputVariable(
                name="extra",
                description="This is extra.",
                default="default",
                is_required=False,
                json_schema='{ "type": "string", "description": "This is extra."}',
            ),
        ],
    ),
)
server = kernel.as_mcp_server(server_name="sk")
```
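Conceptually, `as_mcp_server` has to turn each registered function into an MCP tool definition whose input schema is derived from the function's signature. Here is a hypothetical, stdlib-only sketch of that mapping (not Semantic Kernel's actual code, and simplified to string-typed parameters), using `echo_function` from above:

```python
import inspect

def echo_function(message: str, extra: str = "") -> str:
    """Echo a message as a function"""
    return f"Function echo: {message} {extra}"

def to_tool_schema(func) -> dict:
    """Derive a minimal MCP-style tool definition from a function signature."""
    sig = inspect.signature(func)
    properties: dict = {}
    required: list[str] = []
    for name, param in sig.parameters.items():
        properties[name] = {"type": "string"}  # simplified: assume string params
        if param.default is inspect.Parameter.empty:
            required.append(name)  # no default means the caller must supply it
    return {
        "name": func.__name__,
        "description": inspect.getdoc(func),
        "inputSchema": {"type": "object", "properties": properties, "required": required},
    }

tool = to_tool_schema(echo_function)
print(tool["name"], tool["inputSchema"]["required"])  # echo_function ['message']
```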

The `server` object created above comes from the `mcp` package; you can extend it further, for instance by adding resources or other features to it. You can then bring it online, for instance over stdio:

```python
from typing import Any

import anyio
from mcp.server.stdio import stdio_server

async def handle_stdin(stdin: Any | None = None, stdout: Any | None = None) -> None:
    async with stdio_server() as (read_stream, write_stream):
        await server.run(read_stream, write_stream, server.create_initialization_options())

anyio.run(handle_stdin)
```

Or with SSE:

```python
import uvicorn
from mcp.server.sse import SseServerTransport
from starlette.applications import Starlette
from starlette.routing import Mount, Route

sse = SseServerTransport("/messages/")

async def handle_sse(request):
    async with sse.connect_sse(request.scope, request.receive, request._send) as (read_stream, write_stream):
        await server.run(read_stream, write_stream, server.create_initialization_options())

starlette_app = Starlette(
    debug=True,
    routes=[
        Route("/sse", endpoint=handle_sse),
        Mount("/messages/", app=sse.handle_post_message),
    ],
)

uvicorn.run(starlette_app, host="0.0.0.0", port=8000)
```

### Exposing prompt templates as MCP Prompts

You can also expose Semantic Kernel prompt templates as MCP Prompts, like this:

```python
from semantic_kernel.prompt_template import InputVariable, KernelPromptTemplate, PromptTemplateConfig

# `template` is assumed to be defined elsewhere as the prompt template string.
prompt = KernelPromptTemplate(
    prompt_template_config=PromptTemplateConfig(
        name="release_notes_prompt",
        description="This creates the prompts for a full set of release notes based on the PR messages given.",
        template=template,
        input_variables=[
            InputVariable(
                name="messages",
                description="These are the PR messages, they are a single string with new lines.",
                is_required=True,
                json_schema='{"type": "string"}',
            )
        ],
    )
)

server = kernel.as_mcp_server(server_name="sk_release_notes", prompts=[prompt])
```

::: zone-end

::: zone pivot="programming-language-java"
@@ -99,8 +215,8 @@ Kernel kernel = Kernel.builder()

::: zone-end

::: zone pivot="programming-language-csharp"

## Using Dependency Injection

In C#, you can use Dependency Injection to create a kernel. This is done by creating a `ServiceCollection` and adding services and plugins to it. Below is an example of how you can create a kernel using Dependency Injection.
@@ -151,6 +267,7 @@ builder.Services.AddTransient((serviceProvider)=> {
::: zone-end

## Next steps

Now that you understand the kernel, you can learn about all the different AI services that you can add to it.

> [!div class="nextstepaction"]

semantic-kernel/concepts/plugins/TOC.yml

Lines changed: 2 additions & 0 deletions

@@ -4,6 +4,8 @@
  href: adding-native-plugins.md
- name: Adding OpenAPI plugins
  href: adding-openapi-plugins.md
- name: Adding MCP plugins
  href: adding-mcp-plugins.md
- name: Adding Logic Apps as plugins
  href: adding-logic-apps-as-plugins.md
- name: Using data retrieval functions for RAG
Lines changed: 118 additions & 0 deletions

@@ -0,0 +1,118 @@
---
title: Give agents access to MCP Servers
description: Learn how to add plugins from an MCP Server to your agents in Semantic Kernel.
zone_pivot_groups: programming-languages
author: eavanvalkenburg
ms.topic: conceptual
ms.author: edvan
ms.date: 04/15/2025
ms.service: semantic-kernel
---

# Add plugins from an MCP Server

MCP is the Model Context Protocol, an open protocol designed to make it easy to add capabilities to AI applications; see [the documentation](https://modelcontextprotocol.io/introduction) for more info.
Semantic Kernel allows you to add plugins from an MCP Server to your agents. This is useful when you want to use plugins that are made available as an MCP Server.

Semantic Kernel supports both local MCP Servers (through stdio) and servers that connect through SSE over HTTPS.

## Add plugins from a local MCP Server

To add a locally running MCP server, you use the familiar MCP commands, such as `npx`, `docker`, or `uvx`; make sure the one you want to use is installed.

For instance, in your Claude Desktop config or your VS Code settings.json, you would see something like this:

```json
{
    "mcpServers": {
        "github": {
            "command": "docker",
            "args": [
                "run",
                "-i",
                "--rm",
                "-e",
                "GITHUB_PERSONAL_ACCESS_TOKEN",
                "ghcr.io/github/github-mcp-server"
            ],
            "env": {
                "GITHUB_PERSONAL_ACCESS_TOKEN": "..."
            }
        }
    }
}
```
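The config entry above already carries everything a stdio-based plugin needs: the command, its arguments, and the environment. As a small illustration, you could pull those fields out of such a config with plain `json` (a sketch using the key names from the config format above):

```python
import json

raw = """
{
  "mcpServers": {
    "github": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "-e", "GITHUB_PERSONAL_ACCESS_TOKEN", "ghcr.io/github/github-mcp-server"],
      "env": {"GITHUB_PERSONAL_ACCESS_TOKEN": "..."}
    }
  }
}
"""

config = json.loads(raw)
github = config["mcpServers"]["github"]
# These three fields map onto the command, args, and env of a stdio plugin.
print(github["command"])    # docker
print(len(github["args"]))  # 6
```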

To make the same plugin available to your kernel or agent, you would do this:

::: zone pivot="programming-language-python"

```python
import os

from semantic_kernel import Kernel
from semantic_kernel.connectors.mcp import MCPStdioPlugin

async def main():
    async with MCPStdioPlugin(
        name="Github",
        description="Github Plugin",
        command="docker",
        args=["run", "-i", "--rm", "-e", "GITHUB_PERSONAL_ACCESS_TOKEN", "ghcr.io/github/github-mcp-server"],
        env={"GITHUB_PERSONAL_ACCESS_TOKEN": os.getenv("GITHUB_PERSONAL_ACCESS_TOKEN")},
    ) as github_plugin:
        kernel = Kernel()
        kernel.add_plugin(github_plugin)
        # Do something with the kernel
```

An SSE-based MCP server is even simpler, as it only needs the URL:

```python
from semantic_kernel import Kernel
from semantic_kernel.connectors.mcp import MCPSsePlugin

async def main():
    async with MCPSsePlugin(
        name="Github",
        description="Github Plugin",
        url="http://localhost:8080",
    ) as github_plugin:
        kernel = Kernel()
        kernel.add_plugin(github_plugin)
        # Do something with the kernel
```

In both cases, the async context manager is used to set up and close the connection; you can also do this manually:

```python
from semantic_kernel import Kernel
from semantic_kernel.connectors.mcp import MCPSsePlugin

async def main():
    plugin = MCPSsePlugin(
        name="Github",
        description="Github Plugin",
        url="http://localhost:8080",
    )
    await plugin.connect()
    kernel = Kernel()
    kernel.add_plugin(plugin)
    # Do something with the kernel
    await plugin.close()
```
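The connect/close pair above is exactly what the `async with` form automates. A stdlib-only sketch of that pattern, with a hypothetical `FakePlugin` standing in for the real plugin class:

```python
import asyncio

class FakePlugin:
    """Stand-in for a connection-backed plugin (hypothetical, not the SK class)."""

    def __init__(self) -> None:
        self.connected = False

    async def connect(self) -> None:
        self.connected = True

    async def close(self) -> None:
        self.connected = False

    async def __aenter__(self) -> "FakePlugin":
        await self.connect()
        return self

    async def __aexit__(self, *exc_info) -> None:
        await self.close()

async def demo() -> tuple[bool, bool]:
    async with FakePlugin() as plugin:
        inside = plugin.connected  # connection is open inside the block
    return inside, plugin.connected  # and closed again on exit

print(asyncio.run(demo()))  # (True, False)
```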

::: zone-end
::: zone pivot="programming-language-csharp"

> [!NOTE]
> MCP documentation is coming soon for .NET.

::: zone-end
::: zone pivot="programming-language-java"

> [!NOTE]
> MCP documentation is coming soon for Java.

::: zone-end
