How to stream results back when using an LLM API directly? #4724
Replies: 1 comment
I'm looking into a similar thing. I'd love to be able to configure the Google Vertex model to ground to a source other than Google Search. In this case I have created a data source that includes documents in a Google Drive folder, but in the future I'd like other sources as well. Is this something that can be configured using the Vercel AI SDK? The comment from @Godrules500 suggests that this is not possible yet. Looking at the nodejs-vertexai README.md, it seems like this is possible through the API / libraries that Google has released. Can this be done today (point me to the docs)?
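To sketch what I mean (untested, just my reading of the nodejs-vertexai README): the `@google-cloud/vertexai` client appears to accept a grounding tool on the model, so something like the following should point Gemini at a Vertex AI Search data store (which can index a Google Drive folder) instead of Google Search. The project, location, model, and datastore path are placeholders, and I'm assuming the `retrieval` / `vertexAiSearch` tool shape from the library's types.

```ts
import { VertexAI } from '@google-cloud/vertexai';

// Placeholder project/location/model values; replace with your own.
const vertexAI = new VertexAI({ project: 'my-project', location: 'us-central1' });

const model = vertexAI.getGenerativeModel({
  model: 'gemini-1.5-pro',
  // Ground responses in a Vertex AI Search data store rather than
  // Google Search. The datastore resource path is a placeholder.
  tools: [
    {
      retrieval: {
        vertexAiSearch: {
          datastore:
            'projects/my-project/locations/global/collections/default_collection/dataStores/my-datastore',
        },
      },
    },
  ],
});

const result = await model.generateContent({
  contents: [
    { role: 'user', parts: [{ text: 'What do the Drive docs say about X?' }] },
  ],
});
console.log(result.response.candidates?.[0]?.content?.parts?.[0]?.text);
```

Is something like this the intended way to do it today, or is there a documented path I'm missing?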
Since the Vercel AI SDK does not yet support the other grounding types that Vertex AI provides, I am going to call the Vertex AI API directly. Even so, I still want to stream the response back to the front end. How can I do this with the tools provided by the AI SDK?
So the workflow would be:
user asks a question --> pre-processing determines that we have a vector DB for this question --> call the Vertex AI API directly --> stream the response back to the AI SDK UI.
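One approach I'm considering (a sketch only, not verified): have the route handler call Vertex directly and return the chunks as a plain text stream, then tell `useChat` to consume it with the AI SDK's text stream protocol instead of the default data stream protocol. The route path, project, and model names below are placeholders, and I'm assuming AI SDK 4.x (`@ai-sdk/react`) and the `@google-cloud/vertexai` client.

```ts
// app/api/chat/route.ts — call the Vertex AI API directly and
// re-expose its chunks as a plain text stream.
import { VertexAI } from '@google-cloud/vertexai';

const vertexAI = new VertexAI({ project: 'my-project', location: 'us-central1' });
const model = vertexAI.getGenerativeModel({ model: 'gemini-1.5-pro' });

export async function POST(req: Request) {
  const { messages } = await req.json();
  // Assumption: the last message holds the user's question.
  const prompt = messages[messages.length - 1].content;

  const result = await model.generateContentStream({
    contents: [{ role: 'user', parts: [{ text: prompt }] }],
  });

  const encoder = new TextEncoder();
  // Wrap the Vertex chunk iterator in a ReadableStream of raw text.
  const stream = new ReadableStream({
    async start(controller) {
      for await (const chunk of result.stream) {
        const text = chunk.candidates?.[0]?.content?.parts?.[0]?.text;
        if (text) controller.enqueue(encoder.encode(text));
      }
      controller.close();
    },
  });

  return new Response(stream, {
    headers: { 'Content-Type': 'text/plain; charset=utf-8' },
  });
}
```

On the client, `useChat` would then need `streamProtocol: 'text'` so it treats the response body as raw text chunks:

```tsx
'use client';
import { useChat } from '@ai-sdk/react';

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/chat',
    streamProtocol: 'text', // consume raw text chunks, not the data stream protocol
  });

  return (
    <form onSubmit={handleSubmit}>
      {messages.map((m) => (
        <div key={m.id}>
          {m.role}: {m.content}
        </div>
      ))}
      <input value={input} onChange={handleInputChange} />
    </form>
  );
}
```

The tradeoff, as I understand it, is that the text protocol only carries text, so tool calls and annotations wouldn't come through; for this workflow that seems acceptable. Does this look right, or is there a better-supported way?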