README.md (+11 −13)
@@ -18,16 +18,21 @@ OpenUI lets you describe UI using your imagination, then see it rendered live.

 ## Running Locally

-OpenUI supports [OpenAI](https://platform.openai.com/api-keys), [Groq](https://console.groq.com/keys), and any model [LiteLLM](https://docs.litellm.ai/docs/) supports such as [Gemini](https://aistudio.google.com/app/apikey) or [Anthropic (Claude)](https://console.anthropic.com/settings/keys). The following environment variables are optional, but need to be set in your environment for these services to work:
+OpenUI supports [OpenAI](https://platform.openai.com/api-keys), [Groq](https://console.groq.com/keys), and any model [LiteLLM](https://docs.litellm.ai/docs/) supports, such as [Gemini](https://aistudio.google.com/app/apikey) or [Anthropic (Claude)](https://console.anthropic.com/settings/keys). The following environment variables are optional, but need to be set in your environment for alternative models to work:

 - **OpenAI** `OPENAI_API_KEY`
 - **Groq** `GROQ_API_KEY`
 - **Gemini** `GEMINI_API_KEY`
 - **Anthropic** `ANTHROPIC_API_KEY`
 - **Cohere** `COHERE_API_KEY`
 - **Mistral** `MISTRAL_API_KEY`
+- **OpenAI Compatible** `OPENAI_COMPATIBLE_ENDPOINT` and `OPENAI_COMPATIBLE_API_KEY`

-You can also use models available to [Ollama](https://ollama.com). [Install Ollama](https://ollama.com/download) and pull a model like [Llava](https://ollama.com/library/llava). If Ollama is not running on http://127.0.0.1:11434, you can set the `OLLAMA_HOST` environment variable to the host and port of your Ollama instance.
+For example, if you're running a tool like [localai](https://localai.io/) you can set `OPENAI_COMPATIBLE_ENDPOINT` and optionally `OPENAI_COMPATIBLE_API_KEY` to have the available models listed in the UI's model selector under LiteLLM.
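A minimal sketch of the environment setup these new lines describe, assuming an OpenAI-compatible server such as localai listening on port 8080 (the endpoint URL and key below are illustrative, not project defaults):

```bash
# Point OpenUI at any OpenAI-compatible server.
# The endpoint is illustrative; use wherever your server actually listens.
export OPENAI_COMPATIBLE_ENDPOINT=http://localhost:8080/v1
# Optional: only needed if your server enforces authentication
export OPENAI_COMPATIBLE_API_KEY=sk-xxx
```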
+
+### Ollama
+
+You can also use models available to [Ollama](https://ollama.com). [Install Ollama](https://ollama.com/download) and pull a model like [Llava](https://ollama.com/library/llava). If Ollama is not running on http://127.0.0.1:11434, you can set the `OLLAMA_HOST` environment variable to the host and port of your Ollama instance. For example, when running in Docker you'll need to point to http://host.docker.internal:11434 as shown below.
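A sketch of the Docker scenario this paragraph describes; the image name is an assumption, so check the Docker section for the exact command:

```bash
# Pull a multimodal model on the host first
ollama pull llava
# From inside a container, 127.0.0.1 is the container itself, so point
# OpenUI at the host's Ollama instance instead (image name assumed)
docker run -e OLLAMA_HOST=http://host.docker.internal:11434 -p 7878:7878 ghcr.io/wandb/openui
```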

 ### Docker (preferred)

@@ -43,23 +48,15 @@ Now you can go to [http://localhost:7878](http://localhost:7878) and generate new

 ### From Source / Python

-Assuming you have git and python installed:
-
-> **Note:** There's a .python-version file that specifies **openui** as the virtual env name. Assuming you have pyenv and pyenv-virtualenv you can run the following from the root of the repository or just run `pyenv local 3.X` where X is the version of python you have installed.
-> ```bash
-> pyenv virtualenv 3.12.2 openui
-> pyenv local openui
-> ```
+Assuming you have git and [uv](https://github.com/astral-sh/uv) installed:

 ```bash
 git clone https://github.com/wandb/openui
 cd openui/backend
-# You probably want to do this from a virtual environment
-pip install .
+uv sync --frozen --extra litellm
+source .venv/bin/activate
 # Set API keys for any LLMs you want to use
 export OPENAI_API_KEY=xxx
-# You may change the base url to use an OpenAI-compatible api by setting the OPENAI_BASE_URL environment variable
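The last removed line referenced `OPENAI_BASE_URL`, an environment variable the official OpenAI Python client reads, so the old flow would have looked roughly like this (endpoint illustrative):

```bash
# Send OpenAI traffic to an OpenAI-compatible server instead of api.openai.com
export OPENAI_BASE_URL=http://localhost:8080/v1
```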