feat: add Semantic Kernel Adapter documentation and usage examples in user guides (#5256)

Partially address #5205 and #5226
ekzhu authored Jan 30, 2025
1 parent 7020f2a commit 403844e
Showing 3 changed files with 330 additions and 107 deletions.
@@ -327,6 +327,101 @@
"response = await model_client.create([UserMessage(content=\"What is the capital of France?\", source=\"user\")])\n",
"print(response)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Semantic Kernel Adapter\n",
"\n",
"The {py:class}`~autogen_ext.models.semantic_kernel.SKChatCompletionAdapter`\n",
"allows you to use Semantic kernel model clients as a\n",
"{py:class}`~autogen_core.models.ChatCompletionClient` by adapting them to the required interface.\n",
"\n",
"You need to install the relevant provider extras to use this adapter. \n",
"\n",
"The list of extras that can be installed:\n",
"\n",
"- `semantic-kernel-anthropic`: Install this extra to use Anthropic models.\n",
"- `semantic-kernel-google`: Install this extra to use Google Gemini models.\n",
"- `semantic-kernel-ollama`: Install this extra to use Ollama models.\n",
"- `semantic-kernel-mistralai`: Install this extra to use MistralAI models.\n",
"- `semantic-kernel-aws`: Install this extra to use AWS models.\n",
"- `semantic-kernel-hugging-face`: Install this extra to use Hugging Face models.\n",
"\n",
"For example, to use Anthropic models, you need to install `semantic-kernel-anthropic`."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"vscode": {
"languageId": "shellscript"
}
},
"outputs": [],
"source": [
"# pip install \"autogen-ext[semantic-kernel-anthropic]\""
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"To use this adapter, you need create a Semantic Kernel model client and pass it to the adapter.\n",
"\n",
"For example, to use the Anthropic model:"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"finish_reason='stop' content='The capital of France is Paris. It is also the largest city in France and one of the most populous metropolitan areas in Europe.' usage=RequestUsage(prompt_tokens=0, completion_tokens=0) cached=False logprobs=None\n"
]
}
],
"source": [
"import os\n",
"\n",
"from autogen_core.models import UserMessage\n",
"from autogen_ext.models.semantic_kernel import SKChatCompletionAdapter\n",
"from semantic_kernel import Kernel\n",
"from semantic_kernel.connectors.ai.anthropic import AnthropicChatCompletion, AnthropicChatPromptExecutionSettings\n",
"from semantic_kernel.memory.null_memory import NullMemory\n",
"\n",
"sk_client = AnthropicChatCompletion(\n",
" ai_model_id=\"claude-3-5-sonnet-20241022\",\n",
" api_key=os.environ[\"ANTHROPIC_API_KEY\"],\n",
" service_id=\"my-service-id\", # Optional; for targeting specific services within Semantic Kernel\n",
")\n",
"settings = AnthropicChatPromptExecutionSettings(\n",
" temperature=0.2,\n",
")\n",
"\n",
"anthropic_model_client = SKChatCompletionAdapter(\n",
" sk_client, kernel=Kernel(memory=NullMemory()), prompt_settings=settings\n",
")\n",
"\n",
"# Call the model directly.\n",
"model_result = await anthropic_model_client.create(\n",
" messages=[UserMessage(content=\"What is the capital of France?\", source=\"User\")]\n",
")\n",
"print(model_result)"
]
},
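{
"cell_type": "markdown",
"metadata": {},
"source": [
"Since {py:class}`~autogen_ext.models.semantic_kernel.SKChatCompletionAdapter` implements the\n",
"{py:class}`~autogen_core.models.ChatCompletionClient` interface, you can also stream responses\n",
"from it with the standard `create_stream` method. The following is a minimal sketch that assumes\n",
"the `anthropic_model_client` created above; `create_stream` yields string chunks followed by a\n",
"final `CreateResult`."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from autogen_core.models import CreateResult\n",
"\n",
"# Minimal streaming sketch, reusing the adapter created above.\n",
"stream = anthropic_model_client.create_stream(\n",
"    messages=[UserMessage(content=\"Name three cities in France.\", source=\"User\")]\n",
")\n",
"async for item in stream:\n",
"    if isinstance(item, CreateResult):\n",
"        # The last item is the final result with usage information.\n",
"        print()\n",
"        print(item.usage)\n",
"    else:\n",
"        # Intermediate items are string chunks of the response.\n",
"        print(item, end=\"\", flush=True)"
]
},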
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Read more about the [Semantic Kernel Adapter](../../../reference/python/autogen_ext.models.semantic_kernel.rst)."
]
}
],
"metadata": {
@@ -336,6 +336,101 @@
"print(response)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Semantic Kernel Adapter\n",
"\n",
"The {py:class}`~autogen_ext.models.semantic_kernel.SKChatCompletionAdapter`\n",
"allows you to use Semantic kernel model clients as a\n",
"{py:class}`~autogen_core.models.ChatCompletionClient` by adapting them to the required interface.\n",
"\n",
"You need to install the relevant provider extras to use this adapter. \n",
"\n",
"The list of extras that can be installed:\n",
"\n",
"- `semantic-kernel-anthropic`: Install this extra to use Anthropic models.\n",
"- `semantic-kernel-google`: Install this extra to use Google Gemini models.\n",
"- `semantic-kernel-ollama`: Install this extra to use Ollama models.\n",
"- `semantic-kernel-mistralai`: Install this extra to use MistralAI models.\n",
"- `semantic-kernel-aws`: Install this extra to use AWS models.\n",
"- `semantic-kernel-hugging-face`: Install this extra to use Hugging Face models.\n",
"\n",
"For example, to use Anthropic models, you need to install `semantic-kernel-anthropic`."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"vscode": {
"languageId": "shellscript"
}
},
"outputs": [],
"source": [
"# pip install \"autogen-ext[semantic-kernel-anthropic]\""
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"To use this adapter, you need create a Semantic Kernel model client and pass it to the adapter.\n",
"\n",
"For example, to use the Anthropic model:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"finish_reason='stop' content='The capital of France is Paris. It is also the largest city in France and one of the most populous metropolitan areas in Europe.' usage=RequestUsage(prompt_tokens=0, completion_tokens=0) cached=False logprobs=None\n"
]
}
],
"source": [
"import os\n",
"\n",
"from autogen_core.models import UserMessage\n",
"from autogen_ext.models.semantic_kernel import SKChatCompletionAdapter\n",
"from semantic_kernel import Kernel\n",
"from semantic_kernel.connectors.ai.anthropic import AnthropicChatCompletion, AnthropicChatPromptExecutionSettings\n",
"from semantic_kernel.memory.null_memory import NullMemory\n",
"\n",
"sk_client = AnthropicChatCompletion(\n",
" ai_model_id=\"claude-3-5-sonnet-20241022\",\n",
" api_key=os.environ[\"ANTHROPIC_API_KEY\"],\n",
" service_id=\"my-service-id\", # Optional; for targeting specific services within Semantic Kernel\n",
")\n",
"settings = AnthropicChatPromptExecutionSettings(\n",
" temperature=0.2,\n",
")\n",
"\n",
"anthropic_model_client = SKChatCompletionAdapter(\n",
" sk_client, kernel=Kernel(memory=NullMemory()), prompt_settings=settings\n",
")\n",
"\n",
"# Call the model directly.\n",
"model_result = await anthropic_model_client.create(\n",
" messages=[UserMessage(content=\"What is the capital of France?\", source=\"User\")]\n",
")\n",
"print(model_result)"
]
},
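{
"cell_type": "markdown",
"metadata": {},
"source": [
"Because the adapter is a {py:class}`~autogen_core.models.ChatCompletionClient`, you can pass it\n",
"anywhere a model client is expected. The sketch below is an illustration, assuming the\n",
"`autogen-agentchat` package is installed, that reuses the `anthropic_model_client` created above\n",
"as the model client of an `AssistantAgent`."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from autogen_agentchat.agents import AssistantAgent\n",
"\n",
"# Illustrative sketch: any ChatCompletionClient, including the adapter, can back an agent.\n",
"assistant = AssistantAgent(\"assistant\", model_client=anthropic_model_client)\n",
"result = await assistant.run(task=\"What is the capital of France?\")\n",
"print(result.messages[-1].content)"
]
},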
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Read more about the [Semantic Kernel Adapter](../../../reference/python/autogen_ext.models.semantic_kernel.rst)."
]
},
{
"cell_type": "markdown",
"metadata": {},