Can I customize the prompt used for tool-result steps? #4945
jonathanbetz asked this question in Help · Unanswered · 0 replies
I'm experimenting with the AI SDK via the Next.js Chatbot Template, and I'm particularly interested in the output of tool calls.
I've built a tool that successfully returns JSON, and I've added a new UI component to render the data appropriately.
However, if I'm reading @stream-text.ts correctly, streamText() takes the tool-result JSON and passes it back to the LLM in a 'tool-result' step, which re-uses the system prompt along with the tool results. This leads to the LLM generating a message that nearly duplicates the information presented in the UI component.
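For context, here's roughly what my route handler looks like (simplified; `getWeather`, the model choice, and the tool name are stand-ins for my actual code):

```ts
import { streamText, tool } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

// Stand-in for my real data fetch; the actual tool returns a richer JSON payload.
async function getWeather(city: string) {
  return { city, temperatureF: 72, conditions: 'sunny' };
}

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai('gpt-4o'),
    system: 'You are a friendly assistant.',
    messages,
    maxSteps: 2, // allows the follow-up step that re-sends the system prompt with the tool result
    tools: {
      getWeather: tool({
        description: 'Get the current weather for a city',
        parameters: z.object({ city: z.string() }),
        // This JSON is what my custom UI component renders on the client.
        execute: async ({ city }) => getWeather(city),
      }),
    },
  });

  return result.toDataStreamResponse();
}
```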
You can recreate this yourself in the Chatbot demo by asking "What's the weather in <some city>?"; you'll get a weather component accompanied by a prose description of the same information. Ideally, I could either:
(a) supply a tool-specific prompt that tells the LLM how to describe the data returned by the tool, or
(b) indicate that this is a tool that does not require any additional text beyond the tool response (a rough sketch of both ideas is below).
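To make (a) and (b) concrete, this is the kind of thing I'm imagining; `resultPrompt` and `suppressTextResponse` are made up here, and as far as I can tell nothing like them exists in the SDK today:

```ts
tools: {
  getWeather: tool({
    description: 'Get the current weather for a city',
    parameters: z.object({ city: z.string() }),
    execute: async ({ city }) => getWeather(city),
    // (a) hypothetical: a tool-specific prompt used for the 'tool-result' step
    resultPrompt: 'Reply with one short sentence; the UI already shows the full details.',
    // (b) hypothetical: skip generating any assistant text after this tool's result
    suppressTextResponse: true,
  }),
},
```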
I could add some jank in message rendering that checks whether tool results were populated and drops the last message from the LLM in that case (roughly the sketch below), but I'm looking for something more elegant.
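Concretely, that hack would look something like this in the message renderer (the exact message shape varies between SDK/template versions; this assumes `toolInvocations` is populated on assistant messages, as in the template I'm using):

```ts
import type { Message } from 'ai';

// Hide the assistant's trailing prose when the message already carries a tool result
// that my custom UI component renders.
function shouldHideAssistantText(message: Message): boolean {
  return (
    message.role === 'assistant' &&
    (message.toolInvocations?.some((invocation) => invocation.state === 'result') ?? false)
  );
}
```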
Is there a better way to configure how streamText() makes use of tool result data?