Ollama Embedder

‼️ Heads up!

Ollama's /models endpoint lists both LLMs and embedding models in the dropdown selection. Please ensure you select an embedding model for embedding.

llama2, for example, is an LLM, not an embedder.
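As a sketch, assuming you want to use nomic-embed-text (a hypothetical choice here; any embedding model from the Ollama library works the same way), you can pull it and generate a test embedding from the command line:

```shell
# Pull a dedicated embedding model (nomic-embed-text is an example choice;
# substitute any embedding model from the Ollama library)
ollama pull nomic-embed-text

# Generate a test embedding to confirm the model actually embeds text;
# the response should contain an "embedding" array of floats
curl http://127.0.0.1:11434/api/embeddings \
  -d '{"model": "nomic-embed-text", "prompt": "Hello, world"}'
```

If the response is an error rather than an embedding array, the selected model is likely an LLM rather than an embedder.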

Connecting to Ollama

When running Ollama locally with the default settings, connect to it at http://127.0.0.1:11434.
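To verify the connection, you can query Ollama's API directly. This sketch assumes Ollama is running locally on its default port:

```shell
# List the models Ollama currently has installed;
# a JSON response confirms the server is reachable at the default address
curl http://127.0.0.1:11434/api/tags
```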

Ollama supports running both LLMs and embedding models.

Please download the embedding model you wish to use and select it during onboarding or in Settings to have your uploaded documents embedded via Ollama.

You can switch to a different model at any time in Settings.
