
[Feat] Support alternative OpenAI-compatible LLM providers #42

Open

sushen123 opened this issue Feb 26, 2025 · 4 comments
Labels
enhancement New feature or request

Comments

@sushen123
Contributor

What do we need?

Quick Replies currently supports OpenAI, but many OpenAI-compatible providers follow the same API format.
To allow flexibility, we need to add a single input field for an OpenAI-compatible API URL. This should work alongside the existing OpenAI API key and model fields without requiring separate configurations.

Acceptance Criteria

  • Add a single input field for an OpenAI-compatible API URL.
  • The existing OpenAI API key and model fields should be reused for OpenAI-compatible providers.
  • If a custom API URL is provided, requests should be directed to it instead of the default OpenAI API.
  • Quick Replies should function correctly with alternative providers that follow OpenAI's API schema.
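The routing rule in the criteria above could be sketched as follows (a minimal illustration; `resolveEndpoint` and its parameter name are assumptions for this sketch, not the app's actual code):

```typescript
// Default endpoint used when no custom URL is configured.
const DEFAULT_OPENAI_URL = "https://api.openai.com/v1/chat/completions";

// Hypothetical helper: pick the endpoint for chat-completion requests.
// A non-empty custom OpenAI-compatible URL takes precedence; otherwise
// fall back to the default OpenAI endpoint.
function resolveEndpoint(customApiUrl?: string): string {
  const trimmed = customApiUrl?.trim();
  return trimmed ? trimmed : DEFAULT_OPENAI_URL;
}
```

Leaving the new field blank would keep current behavior unchanged, so existing users are unaffected.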

Relevant Screenshots

No response

Further Comments

No response

@sushen123 sushen123 added the enhancement New feature or request label Feb 26, 2025
@sushen123
Contributor Author

Hey @VipinDevelops, is this a valid issue to work on? Since Quick Replies already supports OpenAI, adding support for OpenAI-compatible providers seems like a useful enhancement.

@VipinDevelops
Collaborator

VipinDevelops commented Feb 26, 2025

@sushen123 Could you share docs or something explaining more about what you are trying to achieve here?
I'm not able to get your point.

@sushen123
Contributor Author

sushen123 commented Feb 27, 2025

> @sushen123 Could you share doc's or something explaining more about what you are trying to achieve here? Not able to get your point here.

Right now, Quick Replies only works with the OpenAI API, but many AI providers like Together AI, Groq, Fireworks AI, Anyscale, and Google Gemini follow a similar API structure.

That means we could let users switch to a different AI provider just by adding a custom API URL field, without needing separate configurations.

The default endpoint is:
https://api.openai.com/v1/chat/completions

If a user wants to switch to Together AI, they could enter:
https://api.together.xyz/v1/chat/completions

For Google Gemini, they could enter:
https://generativelanguage.googleapis.com/v1beta/chat/completions

  • No extra fields are needed for the API key or model; the existing OpenAI fields will be used.
  • Only one additional field is needed, for the API base URL.
  • No separate Gemini input field is needed; it can also use the OpenAI-compatible setup.

This way, Quick Replies will support multiple AI providers effortlessly, and the UI will stay clean for the user.
Gemini docs: https://ai.google.dev/gemini-api/docs/openai#rest_2
Together AI docs: https://docs.together.ai/reference/chat-completions-1
Ollama: https://ollama.com/blog/openai-compatibility
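Because these providers share OpenAI's request schema, only the URL changes; the key, model, and payload stay the same. A rough sketch of how a request could be assembled (function and type names here are assumptions for illustration, not existing code):

```typescript
// Shape of an OpenAI-style chat-completion request, ready to send.
interface ChatRequest {
  url: string;
  headers: Record<string, string>;
  body: string;
}

// Build a chat-completion request for any OpenAI-compatible provider.
// Only the endpoint differs per provider; the key and model come from
// the existing OpenAI settings fields.
function buildChatRequest(
  apiUrl: string,
  apiKey: string,
  model: string,
  userMessage: string,
): ChatRequest {
  return {
    url: apiUrl,
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: userMessage }],
    }),
  };
}
```

The same builder works for OpenAI, Together AI, Gemini, or a local Ollama server, since they all accept the `model` + `messages` body and a Bearer token.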


(screenshot attached)

Let me know if there's anything you'd like to discuss.

@sushen123
Contributor Author

Hey @VipinDevelops, is this a valid issue to work on?

No branches or pull requests

2 participants