How to perform function/tool calling using non-OpenAI models. #1202

Answered by ekzhu
niuhuluzhihao asked this question in Q&A

If the LLM at least supports generating structured responses, i.e., a JSON object corresponding to the remote API service call, then you should be able to prompt it to generate one given a natural-language task. This is for the AssistantAgent.
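
For example, here is a minimal sketch, assuming the AutoGen v0.2 Python API and an OpenAI-compatible local model server; the model name, endpoint URL, and JSON schema below are placeholders for illustration:

```python
import autogen

# Hypothetical config for an OpenAI-compatible local model server; adjust to your setup.
llm_config = {
    "config_list": [
        {
            "model": "my-local-model",               # placeholder model name
            "base_url": "http://localhost:8000/v1",  # placeholder endpoint
            "api_key": "NULL",                       # many local servers ignore the key
        }
    ]
}

assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config=llm_config,
    # Illustrative system message that pins the model to a machine-parseable format.
    system_message=(
        "You are a helpful assistant with access to one remote tool.\n"
        "To use it, reply with ONLY a JSON object such as "
        '{"tool": "search", "arguments": {"query": "<terms>"}} and nothing else.\n'
        "Once you receive the tool's result, answer the user in plain text "
        "and end with TERMINATE."
    ),
)
```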

For the UserProxyAgent, you can register a new reply function, as you mentioned. In that function, parse the JSON object, make the actual call to the remote API service, and pass the response back to the AssistantAgent.
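
A sketch of such a reply function, assuming AutoGen's `register_reply` mechanism; the JSON schema and the `https://api.example.com` endpoint are made up for illustration:

```python
import json
import requests

def call_remote_api(recipient, messages=None, sender=None, config=None):
    """Parse the assistant's JSON tool call, execute it, and return the result."""
    content = messages[-1].get("content") or ""
    try:
        call = json.loads(content)
    except json.JSONDecodeError:
        # Not a JSON tool call (e.g., the final plain-text answer):
        # fall through to the default reply functions.
        return False, None
    # Hypothetical remote service; swap in your real endpoint and auth.
    response = requests.post(
        f"https://api.example.com/{call['tool']}",
        json=call.get("arguments", {}),
        timeout=30,
    )
    # (True, reply) marks the reply as final; it is sent back to the assistant.
    return True, response.text

user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    code_execution_config=False,
    # Stop once the assistant's plain-text answer ends with TERMINATE.
    is_termination_msg=lambda m: (m.get("content") or "").rstrip().endswith("TERMINATE"),
)
# Check our function first whenever the user proxy replies to an AssistantAgent.
user_proxy.register_reply(autogen.AssistantAgent, call_remote_api)
```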

Then you call user_proxy_agent.initiate_chat(assistant_agent, message="<your task description>") to start the conversation.
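
Putting the sketches above together (the task string is only an example):

```python
user_proxy.initiate_chat(
    assistant,
    message="Use the search tool to find the population of Seattle.",
)
```

The proxy sends the task, the assistant emits the JSON call, `call_remote_api` executes it and returns the service response, and the assistant then composes the final answer.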

The hardest part, in my opinion, is getting the system prompt right for the assistant agent …

Category: Q&A
Labels: tool-usage (suggestion and execution of function/tool call)
2 participants
This discussion was converted from issue #1192 on January 11, 2024 03:21.