Local-First Design
Xio runs entirely on your machine: processing, storage, and LLM inference can all be handled locally with open-source tools, and no connection to third-party services is required unless you opt in.
You can configure Xio to use local models for both embeddings and language generation. Chroma is the default vector database, and integrations such as Whisper handle transcription.
Here is a sample .env configuration for a fully offline setup:

LLM_PROVIDER=local
EMBEDDING_PROVIDER=local
VECTOR_DB=chroma
DISABLE_TELEMETRY=true
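As a rough illustration, the variables above could be read with nothing but the standard library. This is a hedged sketch (Xio's actual configuration loader is not documented here, and the helper names `load_env` and `is_fully_offline` are hypothetical) showing how an offline setup might be verified before launch:

```python
from pathlib import Path

def load_env(path: str) -> dict[str, str]:
    """Parse simple KEY=value lines from a .env file, skipping blanks and comments."""
    env = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

def is_fully_offline(env: dict[str, str]) -> bool:
    """True when both providers are local and telemetry is disabled."""
    return (
        env.get("LLM_PROVIDER") == "local"
        and env.get("EMBEDDING_PROVIDER") == "local"
        and env.get("DISABLE_TELEMETRY", "false").lower() == "true"
    )
```

A check like this makes the offline guarantee explicit: if any provider points at a remote service, the function returns False and the setup can be corrected before any data leaves the machine.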
With this setup, all data stays within your local environment. Xio is designed to be quiet, minimal, and fully under your control. Whether you're using it for personal research, technical projects, or managing internal documents, it adapts to your workflow without interference.