Support for open models through vLLM #145
Hi,
I wonder whether you support other model providers that can be hosted locally. In particular, I am not sure how this interacts with other services, since you use embeddings for the tool.
Thanks
You can use LiteLLM to expose an OpenAI-compatible API in front of other models, and you should then be able to provide the LiteLLM credentials instead of OpenAI credentials. Just make sure that it also has an embedding model configured.
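A minimal smoke test of that setup, assuming a LiteLLM proxy started with `litellm --config config.yaml` and listening on its default port 4000; the model names below are placeholders for whatever the proxy config registers:

```python
# Verify that both chat completions and embeddings work through the
# LiteLLM proxy before wiring it into the library.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000",
    api_key="sk-anything",  # the proxy's master key, if one is configured
)

# 1. Chat completion through the proxy.
chat = client.chat.completions.create(
    model="my-chat-model",  # placeholder: a model name from the proxy config
    messages=[{"role": "user", "content": "Say hello."}],
)
print(chat.choices[0].message.content)

# 2. Embeddings: the tool relies on these too, so verify them as well.
emb = client.embeddings.create(
    model="my-embedding-model",  # placeholder: embedding model in the config
    input="hello world",
)
print(len(emb.data[0].embedding))
```

If both calls succeed, the same base URL and key can then be supplied in place of the OpenAI defaults.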
Hi @prasmussen15, thank you for the recommendation. I put it to the test, but without good results: after the first completion request the system seems to fail, and it also brought down my own LLM server, which I host with vLLM. Is there another approach you can suggest?
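One way to narrow down where the failure happens is to hit the vLLM OpenAI-compatible server directly, bypassing the proxy. A sketch, assuming vLLM is serving on its default port 8000 and using a placeholder model name:

```python
# Send a few completions straight to vLLM to check whether the crash
# is in vLLM itself or in the proxy/client integration.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",
    api_key="EMPTY",  # vLLM accepts any string when no key is set
)

for i in range(3):
    resp = client.chat.completions.create(
        model="my-model",  # placeholder: the model `vllm serve` was given
        messages=[{"role": "user", "content": f"Ping {i}"}],
        max_tokens=16,
    )
    print(i, resp.choices[0].message.content)
```

If the server also goes down under this direct load, the problem is on the vLLM side (memory pressure is a common culprit) rather than in the proxy or client configuration.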