Ollama Connection Troubleshooting
## Ensure Ollama is Running
Before attempting any fixes or URL changes, verify that Ollama is running properly on your device:
- Open your web browser and navigate to http://127.0.0.1:11434
- You should see a plain page that reads `Ollama is running`
If you don't see this page, troubleshoot your Ollama installation and ensure that it is running properly before moving forward.
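If you prefer to verify from a terminal or script, the sketch below performs the same check over HTTP. It is a minimal example that assumes Ollama is listening on the default address, http://127.0.0.1:11434:

```python
# Minimal check that an Ollama server is reachable at the default address.
# Adjust OLLAMA_URL if your instance listens somewhere else.
import urllib.request

OLLAMA_URL = "http://127.0.0.1:11434"

try:
    with urllib.request.urlopen(OLLAMA_URL, timeout=5) as resp:
        # A healthy server responds with HTTP 200 and a short status message.
        print(resp.status, resp.read().decode().strip())
except OSError as err:
    print(f"Could not reach Ollama at {OLLAMA_URL}: {err}")
```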
## Automatic URL Detection (LLM & Embedding Providers)
AnythingLLM features automatic URL detection for Ollama. Manual configuration is only necessary if auto-detection fails.
**URL Successfully Detected:** When you select the Ollama provider, AnythingLLM attempts to auto-detect your Ollama URL. If the option to input the base URL is hidden, the URL was detected automatically.

**URL Detection Failed:** If the manual endpoint input is expanded, the URL could not be detected.
If Ollama was not started when AnythingLLM tried to detect the URL, start up Ollama and then press the Auto-Detect button. This should automatically detect the URL and allow you to begin selecting the Model and Max Tokens values.

## Setting the Correct Ollama URL
If AnythingLLM was unable to detect your URL automatically, this is most likely an issue with your Ollama setup/configuration, NOT with AnythingLLM.

If you have confirmed that your Ollama installation is running properly and is not being blocked by a firewall or other network restriction, you can set the URL manually.
Choose your AnythingLLM version to find the correct Ollama URL:

**Desktop Version:** use http://127.0.0.1:11434
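To double-check that this URL is actually serving the Ollama API (and not some other local process), you can query Ollama's version endpoint. A small sketch, assuming the default Desktop URL above:

```python
# Ask the Ollama server for its version to confirm the URL points at Ollama.
import json
import urllib.request

OLLAMA_URL = "http://127.0.0.1:11434"

with urllib.request.urlopen(f"{OLLAMA_URL}/api/version", timeout=5) as resp:
    info = json.load(resp)

print(info)  # e.g. {'version': '0.5.7'} — the exact value depends on your install
```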
### AnythingLLM Desktop: Built-in vs. Standalone Ollama
AnythingLLM Desktop offers two Ollama options:
- **Built-in AnythingLLM LLM Provider:**
  - Runs a separate Ollama instance internally.
  - Models downloaded to standalone Ollama won't appear here (you can confirm what your standalone instance serves with the sketch below).
- **Standalone Ollama:**
  - Run Ollama separately on your system.
  - Use the URL http://127.0.0.1:11434.
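To see which models your standalone Ollama instance actually exposes (and therefore which models appear when AnythingLLM is pointed at that URL), you can query its tags endpoint. A minimal sketch, assuming the standalone instance is on the default URL:

```python
# List models available on a standalone Ollama instance via its /api/tags endpoint.
# Models pulled through AnythingLLM's built-in provider will not show up here.
import json
import urllib.request

OLLAMA_URL = "http://127.0.0.1:11434"

with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=5) as resp:
    data = json.load(resp)

for model in data.get("models", []):
    print(model["name"])
```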
## Troubleshooting
If you're still experiencing issues:
- Confirm you're using the correct URL for your setup.
- Check for firewall or network issues blocking the connection (see the port check sketch after this list).
- Restart both Ollama and AnythingLLM.
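As a quick firewall/network test, you can check whether anything is accepting connections on the Ollama port at the socket level. A minimal sketch, assuming the default host and port:

```python
# Socket-level reachability check for the default Ollama port.
# A refused or timed-out connection usually means Ollama is not running
# or a firewall is blocking the port, not an AnythingLLM problem.
import socket

HOST, PORT = "127.0.0.1", 11434

try:
    with socket.create_connection((HOST, PORT), timeout=5):
        print(f"{HOST}:{PORT} is reachable.")
except OSError as err:
    print(f"Cannot reach {HOST}:{PORT}: {err}")
```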
If problems persist after trying these steps, please visit our Discord to ask your questions.