Local AI LLM

LocalAI is a popular open-source API and LLM engine that allows you to download any GGUF model from HuggingFace and run it on CPU or GPU.

LocalAI supports LLMs, embedding models, and image-generation models.

Connecting to LocalAI

LocalAI is distributed as a Docker container image that you must configure and run yourself.
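As a rough sketch, starting LocalAI on CPU and checking that it is reachable might look like the following. The image tag `localai/localai:latest-aio-cpu` and port `8080` are assumptions based on LocalAI's published images and defaults; consult the LocalAI documentation for the image and flags that match your version and hardware.

```shell
# Pull and run the all-in-one CPU image in the background
# (image name, tag, and port 8080 are assumptions; adjust for your setup).
docker run -d --name local-ai -p 8080:8080 localai/localai:latest-aio-cpu

# LocalAI exposes an OpenAI-compatible API, so listing the available
# models is a quick way to confirm the server is up:
curl http://localhost:8080/v1/models
```

Once the container is running, enter its base URL (for example `http://localhost:8080/v1`) in your LLM settings.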

You can switch to a different model at any time in the Settings.

Local AI LLM settings