This agent lets you interact with a language model running locally or remotely via the Ollama API. Before using this agent locally, you need to have Ollama installed and running.
- Install Ollama
- Install an Ollama model; we suggest the `phi3` model, as it is set as the default model in the code
- Start the Ollama API server
To configure the agent, run `/agent config ollama` to open the settings file in your default editor, then update the file based on the following example.
```jsonc
{
  // To use the Ollama API service:
  // 1. Install Ollama: `winget install Ollama.Ollama`
  // 2. Start the Ollama API server: `ollama serve`
  // 3. Install an Ollama model: `ollama pull phi3`

  // Declare the Ollama model
  "Model": "phi3",
  // Declare the Ollama endpoint
  "Endpoint": "http://localhost:11434",
  // Enable Ollama streaming
  "Stream": false
}
```
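To see what these settings control, here is a minimal sketch of the kind of request the agent sends to Ollama's `/api/generate` endpoint, assuming the default `Endpoint` and `Model` values from the example above (the function names are illustrative, not part of the agent's code):

```python
import json
import urllib.request

# Values matching the "Endpoint" and "Model" settings in the example above.
ENDPOINT = "http://localhost:11434"
MODEL = "phi3"

def build_generate_request(prompt: str, stream: bool = False) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint.

    `stream` mirrors the "Stream" setting: False asks Ollama to
    return the full completion in a single JSON response.
    """
    return {"model": MODEL, "prompt": prompt, "stream": stream}

def generate(prompt: str) -> str:
    """Send a one-shot (non-streaming) request and return the response text."""
    body = json.dumps(build_generate_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{ENDPOINT}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With `ollama serve` running and the `phi3` model pulled, `generate("Why is the sky blue?")` returns the model's answer as a single string; setting `"Stream": true` instead makes Ollama send the completion incrementally as newline-delimited JSON chunks.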