
Support for anthropic models in v0.4 #5205

Open
raagul-qb opened this issue Jan 26, 2025 · 10 comments

@raagul-qb
What feature would you like to be added?

Currently there is no way to use Anthropic models with AutoGen 0.4. I need a way to use Claude models just as I could in 0.2.

There is also no support for performing RAG at the moment.

Why is this needed?

Claude models are better at planning and deliberating in multi-agent environments.

RAG support makes the agent capable of handling a wider variety of tasks.

@ekzhu
Collaborator

ekzhu commented Jan 26, 2025

Thanks @raagul-qb. I am editing this issue to remove the RAG part and make it more specific.

@rohanthacker was it you or someone else who posted on Reddit about working on an Anthropic client?

@ekzhu ekzhu changed the title Support for anthropic models + RAG Support for anthropic models in v0.4 Jan 26, 2025
@rohanthacker
Contributor

Yes, that was me. I've built out initial versions of the Gemini and Anthropic clients.

@ekzhu
Collaborator

ekzhu commented Jan 26, 2025

Thanks! I have been using the Azure AI client now. Thanks for your contribution.

Would love to review the PR for Gemini and Anthropic clients.

@ekzhu ekzhu added this to the 0.4.x milestone Jan 26, 2025
@Navanit-git

+1. I want to use it via the OpenAI compatibility layer but am unable to do so.
Is there any alternative?

```python
model_client = OpenAIChatCompletionClient(
    model="claude-3-5-sonnet-20241022",
    api_key="sxxx",
    base_url="https://api.anthropic.com/v1/messages",
    model_capabilities={
        "vision": False,
        "function_calling": True,
        "json_output": False,
        "tool_calls": True,
    },
)
```
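For context on why the approach above fails in practice: Anthropic's Messages API has a different request shape from OpenAI's Chat Completions. It requires a top-level `max_tokens`, takes the system prompt as a separate `system` field rather than a `system`-role message, and authenticates with an `x-api-key` header plus an `anthropic-version` header. A stdlib-only sketch of the payload and headers that endpoint expects (field names follow Anthropic's public Messages API docs; the version date and token limit are illustrative):

```python
import json

# Build a Messages API request. Unlike OpenAI's Chat Completions,
# max_tokens is required and the system prompt is a top-level field.
def build_anthropic_request(model: str, system: str, user_text: str,
                            max_tokens: int = 1024) -> dict:
    body = {
        "model": model,
        "max_tokens": max_tokens,
        "system": system,
        "messages": [{"role": "user", "content": user_text}],
    }
    headers = {
        "x-api-key": "<ANTHROPIC_API_KEY>",  # not "Authorization: Bearer ..."
        "anthropic-version": "2023-06-01",   # required versioning header
        "content-type": "application/json",
    }
    return {"url": "https://api.anthropic.com/v1/messages",
            "headers": headers, "body": body}

req = build_anthropic_request("claude-3-5-sonnet-20241022",
                              "You are helpful.", "Hi")
print(json.dumps(req["body"], indent=2))
```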

@rohanthacker
Contributor

> Thanks! I have been using the Azure AI client now. Thanks for your contribution.
>
> Would love to review the PR for Gemini and Anthropic clients.

I'm back on this project this week. I was a bit tied up with work and end-of-year activities. I'll push up a draft soon.

@rohanthacker
Contributor

rohanthacker commented Jan 28, 2025

> +1 I want to use it using the OpenAI compatiblity but unable to do so. Is there any alternative

@Navanit-git
I could be wrong, but I don't think this will work with the OpenAIChatCompletionClient, as the APIs and client libraries for these services are not cross-compatible out of the box.

For an immediate alternative, you could try using the new SKChatCompletionAdapter along with the AnthropicChatCompletion from the Semantic Kernel package.

Refer to these docs for more info:
- SKChatCompletionAdapter from AutoGen
- AnthropicChatCompletion from Semantic Kernel
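To make the incompatibility concrete, here is a rough sketch (plain Python, not the adapter's actual implementation) of the kind of translation an adapter has to do: OpenAI-style chat messages carry the system prompt inside the messages list, while Anthropic's Messages API takes it as a top-level `system` parameter and requires `max_tokens`:

```python
# Translate an OpenAI-style message list into Anthropic Messages API kwargs.
# System-role messages are lifted into a top-level "system" field; the
# remaining turns stay in "messages". max_tokens default is illustrative.
def openai_to_anthropic(messages: list[dict], max_tokens: int = 1024) -> dict:
    system_parts = [m["content"] for m in messages if m["role"] == "system"]
    chat = [m for m in messages if m["role"] != "system"]
    params = {"max_tokens": max_tokens, "messages": chat}
    if system_parts:
        params["system"] = "\n".join(system_parts)
    return params

params = openai_to_anthropic([
    {"role": "system", "content": "You are terse."},
    {"role": "user", "content": "Hi"},
])
print(params["system"])  # -> You are terse.
```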

@ai-is-here

Bumping this—just adding my voice to say this is a valuable feature, and it's unfortunate that 0.4 was released with only OpenAI support (unless I missed something). Personally, I need Anthropic support badly enough that I'll have to go with a different framework for now. Autogen 0.4 looks great and fits what I need, but without Anthropic support—and without time to implement it myself—I’ll likely stick with LangGraph and end up reinventing what AgentChat already provides.

@rohanthacker
Contributor

@ai-is-here This feature is a work in progress.
For immediate support with minimal code and nothing to implement yourself, you could try the Semantic Kernel adapter mentioned above.

@ekzhu Can this task be assigned to me? I'm working on it.

@ekzhu
Collaborator

ekzhu commented Jan 29, 2025

@rohanthacker thank you!

@ai-is-here , see the following example code:

```python
import asyncio
import os

from semantic_kernel import Kernel
from semantic_kernel.memory.null_memory import NullMemory
from semantic_kernel.connectors.ai.anthropic import AnthropicChatCompletion
from autogen_ext.models.semantic_kernel import SKChatCompletionAdapter
from autogen_agentchat.agents import AssistantAgent

async def main() -> None:
    sk_anthropic_client = AnthropicChatCompletion(
        ai_model_id="claude-3-5-sonnet-20241022",
        api_key=os.environ["ANTHROPIC_API_KEY"],
        service_id="my-service-id",  # Optional; for targeting specific services within Semantic Kernel
    )

    # Wrap the Semantic Kernel client so AgentChat can use it as a model client.
    model_client = SKChatCompletionAdapter(sk_anthropic_client, kernel=Kernel(memory=NullMemory()))

    assistant = AssistantAgent("assistant", model_client=model_client)

    result = await assistant.run(task="What is the capital of France?")
    print(result)

asyncio.run(main())
```

It should just work.
