---
title: Configure CodeGate
description: Customizing CodeGate's application settings
sidebar_position: 20
---

## Customize CodeGate's behavior

The CodeGate container runs with default settings that support the Ollama, Anthropic, and OpenAI APIs. To customize its behavior, supply extra configuration parameters to the container as environment variables:

```bash
docker run --name codegate -d -p 8989:8989 -p 9090:9090 \
  [-e KEY=VALUE ...] \
  --restart unless-stopped ghcr.io/stacklok/codegate
```

## Config parameters

CodeGate supports the following parameters:

| Parameter | Default value | Description |
| --- | --- | --- |
| `CODEGATE_OLLAMA_URL` | `http://host.docker.internal:11434` | Specifies the URL of an Ollama instance. Used when the provider in your plugin config is `ollama`. |
| `CODEGATE_VLLM_URL` | `http://localhost:8000` | Specifies the URL of a model hosted by a vLLM server. Used when the provider in your plugin config is `vllm`. |
| `CODEGATE_ANTHROPIC_URL` | `https://api.anthropic.com/v1` | Specifies the Anthropic engine API endpoint URL. |
| `CODEGATE_OPENAI_URL` | `https://api.openai.com/v1` | Specifies the OpenAI engine API endpoint URL. |
| `CODEGATE_APP_LOG_LEVEL` | `WARNING` | Sets the logging level. Valid values: `ERROR`, `WARNING`, `INFO`, `DEBUG` (case sensitive). |
| `CODEGATE_LOG_FORMAT` | `TEXT` | Type of log formatting. Valid values: `TEXT`, `JSON` (case sensitive). |
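
For example, to raise the log verbosity and point CodeGate at an Ollama instance on another host, you could combine parameters like this (the Ollama address shown is illustrative):

```bash
docker run --name codegate -d -p 8989:8989 -p 9090:9090 \
  -e CODEGATE_APP_LOG_LEVEL=DEBUG \
  -e CODEGATE_OLLAMA_URL=http://192.168.1.50:11434 \
  --restart unless-stopped ghcr.io/stacklok/codegate
```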

## Example: Use CodeGate with OpenRouter

OpenRouter is an interface to many large language models. CodeGate's vLLM provider works with OpenRouter's API when used with the Continue IDE plugin.

To use OpenRouter, set the vLLM URL when you launch CodeGate:

```bash
docker run --name codegate -d -p 8989:8989 -p 9090:9090 \
  -e CODEGATE_VLLM_URL=https://openrouter.ai/api \
  --restart unless-stopped ghcr.io/stacklok/codegate
```

Then, configure the Continue IDE plugin to access the vLLM endpoint (`http://localhost:8989/vllm/`) along with the model you'd like to use and your OpenRouter API key.
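
As a sketch, a model entry in Continue's `config.json` might look like the following. The model name is just one example from OpenRouter's catalog, and the exact provider settings depend on your Continue version, so treat this as illustrative rather than definitive:

```json
{
  "models": [
    {
      "title": "CodeGate via OpenRouter",
      "provider": "vllm",
      "model": "anthropic/claude-3.5-sonnet",
      "apiBase": "http://localhost:8989/vllm/",
      "apiKey": "YOUR_OPENROUTER_API_KEY"
    }
  ]
}
```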