We released a hotfix for this version. Please ensure you are on the v1.8.5-r2 release instead.
File Chat Overhaul 🎉
When we first launched AnythingLLM, the average local model context window was around 2K tokens. Now that local models are far more capable, with 16K+ context windows, it was time to overhaul our file UX.
In AnythingLLM Desktop, chatting with files is now a breeze. Whenever your model's context window is large enough, we will use the full file content to answer your questions.
If you upload a file that is too large to fit in the context window, we will ask you to embed the file instead (RAG). If you want to have a file only for RAG, you can do that too via the regular file upload window on the workspace.
Now you can have the best of both worlds. Read more about this change here.
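Under the hood, the choice between sending the full file content and falling back to RAG comes down to a token-budget check against the model's context window. Below is a minimal TypeScript sketch of that decision; the function names and the rough 4-characters-per-token estimate are illustrative assumptions, not AnythingLLM's actual implementation.

```ts
type FileAttachment = { name: string; content: string };
type ChatStrategy = "full-text" | "rag-embed";

// Rough token estimate: ~4 characters per token for English text.
// This heuristic is an assumption, not the tokenizer AnythingLLM uses.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Decide whether a file fits in the prompt as full content, or should be
// embedded into the vector database and retrieved via RAG instead.
function pickStrategy(
  file: FileAttachment,
  modelContextWindow: number,
  reservedForPromptAndReply = 1024 // leave room for the prompt and the reply
): ChatStrategy {
  const budget = modelContextWindow - reservedForPromptAndReply;
  return estimateTokens(file.content) <= budget ? "full-text" : "rag-embed";
}

// Example: a small document easily fits in a 16K context window.
const doc: FileAttachment = {
  name: "notes.txt",
  content: "sample text ".repeat(200),
};
console.log(pickStrategy(doc, 16384)); // "full-text"
```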
Improvements: 🚀
- Modal prompt to clear the embedding cache when you change the text splitter options, so all files share the same splitting logic
- Moonshot AI LLM support
- The native embedder model can now be easily configured. It now supports nomic-embed-text-v1 and multilingual-e5-small!
- PostgreSQL now supports non-public schemas for tables.
- STT now appends spoken text to the input instead of replacing it.
- Mobile Sync support for AnythingLLM Mobile Beta
- More translations, including a new Romanian translation
- New Agent EXA SERP provider
- New Vector Database Chroma Cloud DB support
Bug Fixes:
- Fixed YouTube and XLSX folder name bug where titles contained odd characters
- Fixed multimodal chats for the OpenAI Compatible API
- Fixed an issue where the microphone tooltip was duplicated
- Fixed an issue with the API chat export endpoint
- Fixed an issue with the implied role for Bedrock agents
Pinned Download Links
Revision 1.8.5-r2:
- Mac (x86_64) Download
- Mac (Apple Silicon) Download
- Windows Download
- Windows (ARM) Download
- Linux (x86_64) Download