[Feat] Support alternative OpenAI-compatible LLM providers #42
Comments
hey @VipinDevelops, is this a valid issue to work on? Since Quick Replies already supports OpenAI, adding support for OpenAI-compatible providers seems like a useful enhancement.
@sushen123 Could you share docs or something explaining more about what you are trying to achieve here?
Right now, Quick Replies only works with the OpenAI API, but many AI providers like Together AI, Groq, Fireworks AI, Anyscale, and Google Gemini follow a similar API structure. That means we could let users switch to a different AI provider just by adding a custom API URL field, without needing separate configurations. https://api.openai.com/v1/chat/completions
This way, Quick Replies will support multiple AI providers effortlessly: users get a wider choice of LLM providers, and the UI stays clean. Let me know if there is anything you want to discuss.
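A minimal sketch of the idea above: OpenAI-compatible providers accept the same chat-completions request body, so only the base URL needs to change. The function name and the non-OpenAI provider URLs here are illustrative assumptions, not verified project code or official endpoints.

```python
# Sketch: one request shape, many providers; only the base URL varies.
DEFAULT_BASE_URL = "https://api.openai.com/v1"  # OpenAI's documented base URL


def chat_completions_url(base_url: str = DEFAULT_BASE_URL) -> str:
    """Join a provider base URL with the shared /chat/completions path."""
    return base_url.rstrip("/") + "/chat/completions"


def build_request(model: str, messages: list, base_url: str = DEFAULT_BASE_URL) -> dict:
    """Build a provider-agnostic request: same body, different endpoint.
    (An Authorization: Bearer <key> header would be added at send time.)"""
    return {
        "url": chat_completions_url(base_url),
        "headers": {"Content-Type": "application/json"},
        "body": {"model": model, "messages": messages},
    }
```

For example, pointing `base_url` at a hypothetical Together AI or Groq base URL would reuse the exact same `body`, which is why a single URL field is enough to switch providers.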
hey @VipinDevelops, is this a valid issue to work on?
What do we need?
Quick Replies currently supports OpenAI, but many OpenAI-compatible providers follow the same API format.
To allow flexibility, we need to add a single input field for an OpenAI-compatible API URL. It should work alongside the existing OpenAI API key and model fields without requiring separate configurations.