AnythingLLM Mobile

Introduction

AnythingLLM Mobile is a mobile app that brings the entire AnythingLLM experience onto your phone.

It is currently available for Android only. The app can be downloaded from the Google Play Store or directly from the AnythingLLM Mobile website.

Features

  • Chat with local SLM - Chat with your local SLM (small language model) on your phone. Supports both reasoning and non-reasoning models.
  • Change models on the fly - Easily swap between different models
  • Workspace and Threads - Create workspaces and threads to organize your chats
  • On-device RAG - Process your documents locally and use them in your chats, all fully offline
  • Agentic Tools - Leverage the power of AnythingLLM's agentic tools like web search, web scraping, deep research, and even cross-app interactions like drafting emails or managing your calendar
  • Sync with AnythingLLM Desktop & Cloud - Sync your chats, workspaces, and threads with AnythingLLM Desktop or AnythingLLM Cloud/Self-hosted instances

If you have any general questions, please join the #anythingllm-mobile channel in the AnythingLLM Discord and we'll help you out.

Feedback Reporting

All feedback should be officially reported via the AnythingLLM Discord in the #anythingllm-mobile channel.

Public Issue Tracking

All public issues should be reported via the AnythingLLM Mobile Beta Issue Tracker.

Common Questions

iOS support?

We plan to support iOS in the future. Currently, we are focusing on Android for a full release by the end of September 2025, with iOS support following in October 2025.

Can I download any model I want?

Right now, for performance reasons, we only support a hand-picked set of models. Eventually we will support any model you want, but for now we are focusing on performance and stability.

How does syncing with AnythingLLM Desktop & Cloud work?

💡 Requires version 1.8.5 or higher of AnythingLLM Desktop or AnythingLLM Cloud/Self-hosted instances. In 1.8.5, this feature is hidden behind the "Experimental features" sidebar item.

To enable experimental features, you need to follow the instructions here.

AnythingLLM Mobile, while fully functional as a standalone app, is also designed to be a companion to AnythingLLM Desktop and AnythingLLM Cloud.

Since mobile devices have limited resources, you can sync your chats, workspaces, and threads with AnythingLLM Desktop or AnythingLLM Cloud/Self-hosted instances. You can also delegate inference across your local network or to cloud instances, giving you access to more compute and more powerful models in a mobile form factor!

This technology is called Distributed Inference™ and is a key part of AnythingLLM's vision for the future of local AI.
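
To make the delegation idea concrete, here is a minimal sketch, not the app's actual implementation, of what forwarding a chat request to an AnythingLLM Desktop or Cloud/Self-hosted instance over the developer API could look like from an Android client. The base URL, API key, and workspace slug are placeholder values you would supply yourself, and the exact endpoint path and response fields are assumptions that may vary between versions. The sketch uses OkHttp and org.json, both common on Android.

```kotlin
import okhttp3.MediaType.Companion.toMediaType
import okhttp3.OkHttpClient
import okhttp3.Request
import okhttp3.RequestBody.Companion.toRequestBody
import org.json.JSONObject

// Placeholder values - point these at your own instance and API key.
const val BASE_URL = "http://192.168.1.50:3001"
const val API_KEY = "YOUR-ANYTHINGLLM-API-KEY"

// Sends a single chat message to a workspace on a remote AnythingLLM instance
// and returns the model's reply. Assumes the /api/v1/workspace/{slug}/chat
// developer endpoint with Bearer authentication.
fun chatViaRemoteInstance(workspaceSlug: String, message: String): String {
    val client = OkHttpClient()

    // "chat" mode keeps conversation history on the remote instance.
    val payload = JSONObject()
        .put("message", message)
        .put("mode", "chat")
        .toString()
        .toRequestBody("application/json".toMediaType())

    val request = Request.Builder()
        .url("$BASE_URL/api/v1/workspace/$workspaceSlug/chat")
        .addHeader("Authorization", "Bearer $API_KEY")
        .post(payload)
        .build()

    client.newCall(request).execute().use { response ->
        check(response.isSuccessful) { "Chat request failed: ${response.code}" }
        // The reply is expected in the "textResponse" field of the JSON body.
        return JSONObject(response.body!!.string()).getString("textResponse")
    }
}
```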

How does on-device RAG work?

AnythingLLM Mobile runs a small embedding model + local vector database on your device to provide RAG capabilities with citations.
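
For a sense of the general pattern rather than the app's internal code, the retrieval step boils down to: embed the user's question, rank the stored document-chunk embeddings by similarity, and hand the top chunks plus their sources to the model so the answer can carry citations. The Chunk type and the precomputed queryEmbedding below are hypothetical stand-ins for whatever embedding model and vector store the app actually uses.

```kotlin
import kotlin.math.sqrt

// Hypothetical shape of a document chunk stored in a local vector database.
data class Chunk(val text: String, val source: String, val embedding: FloatArray)

// Cosine similarity between two embedding vectors of equal length.
fun cosine(a: FloatArray, b: FloatArray): Float {
    var dot = 0f; var normA = 0f; var normB = 0f
    for (i in a.indices) {
        dot += a[i] * b[i]
        normA += a[i] * a[i]
        normB += b[i] * b[i]
    }
    return dot / (sqrt(normA) * sqrt(normB))
}

// Return the k chunks most similar to the query embedding produced by the
// on-device embedding model (represented here as a plain FloatArray).
fun retrieve(queryEmbedding: FloatArray, store: List<Chunk>, k: Int = 4): List<Chunk> =
    store.sortedByDescending { cosine(queryEmbedding, it.embedding) }.take(k)

// Build the prompt context and the citation list returned alongside the answer.
fun buildContext(chunks: List<Chunk>): Pair<String, List<String>> {
    val context = chunks.joinToString("\n---\n") { it.text }
    val citations = chunks.map { it.source }.distinct()
    return context to citations
}
```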

How can I add my own agent tools?

Currently, to use custom agent tools, MCPs or otherwise, you should use the sync feature with AnythingLLM Desktop or AnythingLLM Cloud. Customizing agent tools in the standalone mobile app is not yet supported.