How to perform function/tool calling using non-OpenAI models. #1202
-
Assuming that I customize the tool function, must my model have tool-calling capability? If my model does not have tool-calling capability, how should I embed the tool into my custom agent?
-
Currently, tool calling requires LLM support. But you can use the code-generation capability of the LLM instead: customize the system_message to include the definition of the function, and have the model write code that calls it.
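For example, a minimal sketch of that workaround, assuming the pyautogen package and AutoGen's built-in code execution; the `get_weather` function, model name, and endpoint are all hypothetical:

```python
# A minimal sketch of the code-generation workaround with pyautogen.
# `get_weather` is a hypothetical function the executed code can call.
import autogen

system_message = """You are a helpful assistant.
Your model cannot call tools natively. When a task needs external data,
reply with a Python code block that calls this pre-defined function:

    def get_weather(city: str) -> dict:
        '''Return the current weather for `city`.'''

The code block will be executed and its output sent back to you.
Reply TERMINATE when the task is complete."""

assistant = autogen.AssistantAgent(
    name="assistant",
    system_message=system_message,
    llm_config={"config_list": [{
        "model": "my-local-model",               # hypothetical non-OpenAI model
        "base_url": "http://localhost:8000/v1",  # hypothetical endpoint
        "api_key": "unused",
    }]},
)

user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    code_execution_config={"work_dir": "coding", "use_docker": False},
)
```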
-
Thank you very much for your answer.
If the LLM at least supports generating a structured response, i.e., a JSON object corresponding to the remote API service call, then you should be able to prompt it to generate one from a natural-language task. This is for the AssistantAgent.
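For example (a sketch only; the JSON schema and the `get_weather` endpoint name are made up for illustration):

```python
# A sketch of a system prompt that elicits a structured JSON "call".
# The schema and the get_weather endpoint are hypothetical.
JSON_TOOL_PROMPT = """You are a helpful assistant. When a request needs
external data, respond with ONLY a JSON object of this form:

{"endpoint": "<service name>", "arguments": {<parameters>}}

Example: for "What's the weather in Paris?" respond with
{"endpoint": "get_weather", "arguments": {"city": "Paris"}}

After you receive the service response, answer the user in plain text."""
```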
For the UserProxyAgent, you can register a new reply function as you mentioned; in that function, you parse the JSON object, make the actual call to the remote API service, and then pass the response back to the AssistantAgent.
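A sketch of such a reply function, assuming pyautogen's register_reply API; the service URL is a placeholder:

```python
# A sketch of a custom reply function on the UserProxyAgent.
# The endpoint URL is hypothetical; adapt the parsing to your prompt format.
import json

import autogen
import requests


def call_remote_api(recipient, messages=None, sender=None, config=None):
    """Parse the assistant's JSON reply and call the remote service."""
    last = (messages[-1].get("content") or "").strip()
    try:
        request = json.loads(last)
    except json.JSONDecodeError:
        return False, None  # not a tool request; fall through to other replies
    resp = requests.post(
        f"https://api.example.com/{request['endpoint']}",  # hypothetical URL
        json=request["arguments"],
        timeout=30,
    )
    # The returned text goes back to the AssistantAgent as the next message.
    return True, resp.text


user_proxy_agent = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
)
user_proxy_agent.register_reply([autogen.Agent, None], call_remote_api)
```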
Then you call

```python
user_proxy_agent.initiate_chat(assistant_agent, message="<your task description>")
```

to start the conversation. The hardest part, in my opinion, is getting the system prompt right for the assistant agent …
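Putting the sketches above together, a hypothetical end-to-end kickoff could look like this (the model name, endpoint, and task string are all placeholders):

```python
# End-to-end wiring, reusing JSON_TOOL_PROMPT and user_proxy_agent from
# the sketches above; all config values are placeholders.
assistant_agent = autogen.AssistantAgent(
    name="assistant",
    system_message=JSON_TOOL_PROMPT,
    llm_config={"config_list": [{
        "model": "my-local-model",               # hypothetical model
        "base_url": "http://localhost:8000/v1",  # hypothetical endpoint
        "api_key": "unused",
    }]},
)

user_proxy_agent.initiate_chat(
    assistant_agent,
    message="What is the current weather in Paris?",  # example task
)
```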