Ollama Plugin

This agent interacts with a language model running locally or remotely via the Ollama API. Before using this agent locally, you need Ollama installed and running.
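To illustrate the kind of request the agent sends, the sketch below builds a call to Ollama's /api/generate endpoint. The endpoint and model name mirror the example configuration; the helper name is hypothetical, and this sketch only constructs the URL and payload rather than contacting a server:

```python
import json

OLLAMA_ENDPOINT = "http://localhost:11434"  # default Ollama API address

def build_generate_request(model: str, prompt: str, stream: bool = False):
    """Build the URL and JSON body for a call to Ollama's /api/generate."""
    url = f"{OLLAMA_ENDPOINT}/api/generate"
    body = json.dumps({"model": model, "prompt": prompt, "stream": stream})
    return url, body

url, body = build_generate_request("phi3", "Why is the sky blue?")
print(url)
```

Sending `body` as a POST to `url` (with any HTTP client) returns the model's completion.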

Prerequisites for using the agent locally

1. Install Ollama (on Windows: winget install Ollama.Ollama).
2. Start the Ollama API server: ollama serve.
3. Pull a model, for example: ollama pull phi3.

Configuration

To configure the agent, run /agent config ollama to open the settings file in your default editor, then update the file based on the following example.

{
    // To use Ollama API service:
    // 1. Install Ollama: `winget install Ollama.Ollama`
    // 2. Start Ollama API server: `ollama serve`
    // 3. Install Ollama model: `ollama pull phi3`

    // Declare Ollama model
    "Model": "phi3",
    // Declare Ollama endpoint
    "Endpoint": "http://localhost:11434",
    // Enable Ollama streaming
    "Stream": false
}
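The "Stream" setting controls the shape of the API response: with streaming enabled, the Ollama API returns newline-delimited JSON chunks, each carrying a partial "response" field and a final object marked "done": true, whereas with streaming disabled it returns one complete JSON object. As a minimal sketch (the helper name and sample chunks are illustrative, not output from a real server), the partial chunks can be reassembled like this:

```python
import json

def join_streamed_response(ndjson_text: str) -> str:
    """Concatenate the partial 'response' fields of a streamed Ollama reply."""
    parts = []
    for line in ndjson_text.strip().splitlines():
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

# Sample chunks shaped like Ollama's streaming output (contents are made up):
sample = (
    '{"response": "Hello", "done": false}\n'
    '{"response": ", world", "done": false}\n'
    '{"response": "!", "done": true}\n'
)
print(join_streamed_response(sample))  # Hello, world!
```

Setting "Stream" to false, as in the example above, avoids this reassembly step at the cost of waiting for the full completion before anything is displayed.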