Ollama LLM

Ollama is a popular open-source command-line tool and engine that allows you to download quantized versions of the most popular LLM chat models.

Ollama is a separate application that you need to download and connect to first. It supports running LLMs on both CPU and GPU.
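
For example, you can download and test a model from the command line before connecting. Below is a minimal sketch that drives the Ollama CLI from Python; it assumes the ollama binary is on your PATH, and "llama3" is just an example model name.

    import subprocess

    # Download a quantized model from the Ollama library
    # ("llama3" is an example; substitute any model you want).
    subprocess.run(["ollama", "pull", "llama3"], check=True)

    # Run a one-off prompt to confirm the model loads and responds.
    result = subprocess.run(
        ["ollama", "run", "llama3", "Say hello in one sentence."],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout)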

Connecting to Ollama

When running Ollama locally with the default settings, connect to it at http://127.0.0.1:11434.
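
To verify the connection, you can query Ollama's HTTP API at that address directly. Here is a minimal sketch using Python's requests library, assuming the default address above; the model name is again only an example and must already be pulled.

    import requests

    OLLAMA_URL = "http://127.0.0.1:11434"

    # List the models Ollama has downloaded locally.
    tags = requests.get(f"{OLLAMA_URL}/api/tags").json()
    print([m["name"] for m in tags.get("models", [])])

    # Request a single non-streaming completion as a connectivity test.
    response = requests.post(
        f"{OLLAMA_URL}/api/generate",
        json={"model": "llama3", "prompt": "Hello!", "stream": False},
    )
    print(response.json()["response"])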

You can switch to a different model at any time in the Settings.

[Image: Ollama LLM settings]