Support for anthropic models in v0.4 #5205
Comments
Thanks @raagul-qb. I am editing this issue to remove the RAG part and make it more specific. @rohanthacker was it you or someone else who posted on Reddit about working on an Anthropic client?
For RAG, please see RAG tools like GraphRAG: https://microsoft.github.io/autogen/stable/reference/python/autogen_ext.tools.graphrag.html#autogen_ext.tools.graphrag.LocalSearchTool, and check out the samples too: https://github.com/microsoft/autogen/tree/main/python/samples/agentchat_graphrag. For a RAG agent, use #4742 to track progress.
Yes, that was me. I've built out initial versions of the Gemini and Anthropic clients.
Thanks! I have been using the Azure AI client for now. Thanks for your contribution. Would love to review the PRs for the Gemini and Anthropic clients.
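For anyone else who needs a stopgap today, here is a rough sketch of the Azure AI client route mentioned above. It assumes the `AzureAIChatCompletionClient` from `autogen_ext.models.azure` and an Azure AI Inference / GitHub Models endpoint; the endpoint, model name, and `model_info` values are placeholders to adapt to your own deployment:

```python
# Rough sketch of the Azure AI client workaround; endpoint, model name, and
# model_info values below are illustrative placeholders, not verified settings.
import asyncio
import os

from azure.core.credentials import AzureKeyCredential
from autogen_ext.models.azure import AzureAIChatCompletionClient
from autogen_agentchat.agents import AssistantAgent


async def main() -> None:
    model_client = AzureAIChatCompletionClient(
        model="gpt-4o-mini",
        endpoint="https://models.inference.ai.azure.com",
        credential=AzureKeyCredential(os.environ["GITHUB_TOKEN"]),
        model_info={
            "vision": True,
            "function_calling": True,
            "json_output": True,
            "family": "gpt-4o",
        },
    )
    assistant = AssistantAgent("assistant", model_client=model_client)
    result = await assistant.run(task="What is the capital of France?")
    print(result)


asyncio.run(main())
```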
+1 I want to use it via the OpenAI compatibility layer but am unable to do so.
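In case it helps, here is a rough sketch of what the OpenAI-compatibility route could look like with the existing 0.4 OpenAI client. The `base_url` and `model_info` values here are assumptions and need to be checked against the provider's documentation:

```python
# Rough sketch: point AutoGen 0.4's OpenAI client at an OpenAI-compatible
# endpoint serving Claude. The base_url and model_info below are assumptions;
# check the provider's docs for the correct values.
import os

from autogen_ext.models.openai import OpenAIChatCompletionClient

model_client = OpenAIChatCompletionClient(
    model="claude-3-5-sonnet-20241022",
    api_key=os.environ["ANTHROPIC_API_KEY"],
    base_url="https://api.anthropic.com/v1/",  # assumed OpenAI-compatible endpoint
    model_info={
        "vision": True,
        "function_calling": True,
        "json_output": False,
        "family": "unknown",
    },
)
```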
I'm back on this project again this week. I was a bit tied up with work and end-of-year activities. I'll push up a draft soon.
@Navanit-git For an immediate alternative you could try using the new SKChatCompletionAdapter (shown in the example below). Refer to these docs for more info:
Bumping this. Just adding my voice to say this is a valuable feature, and it's unfortunate that 0.4 was released with only OpenAI support (unless I missed something). Personally, I need Anthropic support badly enough that I'll have to go with a different framework for now. AutoGen 0.4 looks great and fits what I need, but without Anthropic support, and without time to implement it myself, I'll likely stick with LangGraph and end up reinventing what AgentChat already provides.
@ai-is-here This feature is a work in progress. @ekzhu Can this task be assigned to me? I'm working on this.
@rohanthacker thank you! @ai-is-here, see the following example code:

```python
import asyncio
import os

from semantic_kernel import Kernel
from semantic_kernel.memory.null_memory import NullMemory
from semantic_kernel.connectors.ai.anthropic import AnthropicChatCompletion
from autogen_ext.models.semantic_kernel import SKChatCompletionAdapter
from autogen_agentchat.agents import AssistantAgent


async def main() -> None:
    # Semantic Kernel's Anthropic connector provides access to Claude models.
    sk_anthropic_client = AnthropicChatCompletion(
        ai_model_id="claude-3-5-sonnet-20241022",
        api_key=os.environ["ANTHROPIC_API_KEY"],
        service_id="my-service-id",  # Optional; for targeting specific services within Semantic Kernel
    )
    # The adapter exposes the Semantic Kernel client through AutoGen's model client interface.
    model_client = SKChatCompletionAdapter(sk_anthropic_client, kernel=Kernel(memory=NullMemory()))
    assistant = AssistantAgent("assistant", model_client=model_client)
    result = await assistant.run(task="What is the capital of France?")
    print(result)


asyncio.run(main())
```

It should just work.
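As a usage note: `assistant.run(...)` returns a `TaskResult`, so if you only want the final reply you can print `result.messages[-1].content` instead of the whole object. If the imports fail, you likely need the Semantic Kernel extras for `autogen-ext` (the exact extra name is in the autogen-ext docs).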
What feature would you like to be added?
Currently there is no way to use Anthropic models with AutoGen 0.4. I need a way to use Claude models just as I was able to in 0.2.
Also, there is no support for performing RAG as of now.
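For context, here is a rough sketch of the kind of 0.2-style setup referred to above. It assumes the optional Anthropic support that later 0.2.x releases exposed via `api_type`; the model name and settings are illustrative:

```python
# Rough sketch of an AutoGen 0.2-style configuration for Claude.
# Assumes the optional Anthropic client support added in later 0.2.x releases;
# values here are illustrative.
import os

from autogen import AssistantAgent, UserProxyAgent

config_list = [
    {
        "model": "claude-3-5-sonnet-20241022",
        "api_key": os.environ["ANTHROPIC_API_KEY"],
        "api_type": "anthropic",
    }
]

assistant = AssistantAgent("assistant", llm_config={"config_list": config_list})
user_proxy = UserProxyAgent(
    "user_proxy",
    human_input_mode="NEVER",
    max_consecutive_auto_reply=0,
    code_execution_config=False,
)
user_proxy.initiate_chat(assistant, message="What is the capital of France?")
```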
Why is this needed?
Claude models are better at planning and deliberating in multi-agent environments.
RAG support makes the agent capable of handling a wider variety of tasks.