Terraphim AI is a privacy-first search assistant built in Rust. It runs on your machine, indexes your repositories, and never sends a byte to the cloud. No surveillance. No compromise.
Philosophy
Terraphim operates entirely on your hardware. Knowledge graphs are built from your documents using Aho-Corasick automata, the same algorithm that powers high-performance multi-pattern matching in tools like ripgrep. Your semantic index stays local. Your search queries never leave your device.
Built by Applied Knowledge Systems Ltd, Terraphim is open-source Rust, auditable to the last byte. Every search, every graph traversal, every relevance score is computed locally.
Privacy is not a feature. It is the architecture.

Terraphim Design Principles
Rolegraph constructs per-role knowledge graphs. Nodes represent concepts extracted from your documents. Edges encode semantic relationships. Thesaurus-driven expansion ensures recall without sacrificing precision.
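The idea can be sketched in a few lines of plain Rust. This is a minimal illustration of a per-role graph, not Terraphim's actual Rolegraph types: the struct name, fields, and the crude edge-weight ranking below are assumptions made for the example.

```rust
use std::collections::HashMap;

/// Simplified role graph: concept nodes and weighted edges.
/// (Illustrative only; the real Rolegraph types differ.)
#[derive(Default)]
struct RoleGraph {
    // node id -> concept label
    nodes: HashMap<u64, String>,
    // (from, to) -> edge weight, bumped each time two concepts co-occur
    edges: HashMap<(u64, u64), u32>,
}

impl RoleGraph {
    fn add_concept(&mut self, id: u64, label: &str) {
        self.nodes.entry(id).or_insert_with(|| label.to_string());
    }

    fn link(&mut self, a: u64, b: u64) {
        *self.edges.entry((a, b)).or_insert(0) += 1;
    }

    /// Rank a concept by the total weight of edges touching it,
    /// a crude stand-in for graph-based relevance.
    fn rank(&self, id: u64) -> u32 {
        self.edges
            .iter()
            .filter(|((a, b), _)| *a == id || *b == id)
            .map(|(_, w)| *w)
            .sum()
    }
}

fn main() {
    let mut g = RoleGraph::default();
    g.add_concept(1, "knowledge graph");
    g.add_concept(2, "thesaurus");
    g.link(1, 2);
    g.link(1, 2);
    println!("rank of node 1: {}", g.rank(1));
}
```

Each role would own one such graph, so the same document base can rank differently for an engineer than for a project manager.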
Index anything. Local folders via ripgrep. Confluence and Jira through Atlassian connectors. Email via JMAP. Discourse forums. Each haystack is a pluggable data source with its own indexing strategy.
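A pluggable source boils down to a common trait. The trait and type names below are hypothetical, chosen for this sketch; Terraphim's actual connector interface may differ.

```rust
/// Hypothetical document type returned by a haystack.
struct IndexedDocument {
    id: String,
    title: String,
    body: String,
}

/// Hypothetical pluggable-source interface: each connector
/// (ripgrep, Atlassian, JMAP, Discourse) would implement this.
trait Haystack {
    /// Human-readable source name.
    fn name(&self) -> &str;
    /// Fetch documents matching a needle from this source.
    fn index(&self, needle: &str) -> Vec<IndexedDocument>;
}

/// A toy in-memory haystack standing in for a real connector.
struct StaticHaystack {
    docs: Vec<IndexedDocument>,
}

impl Haystack for StaticHaystack {
    fn name(&self) -> &str {
        "static"
    }

    fn index(&self, needle: &str) -> Vec<IndexedDocument> {
        self.docs
            .iter()
            .filter(|d| d.body.contains(needle))
            .map(|d| IndexedDocument {
                id: d.id.clone(),
                title: d.title.clone(),
                body: d.body.clone(),
            })
            .collect()
    }
}

fn main() {
    let hay = StaticHaystack {
        docs: vec![IndexedDocument {
            id: "1".into(),
            title: "Notes".into(),
            body: "local-first search".into(),
        }],
    };
    println!("{} hit(s) from {}", hay.index("search").len(), hay.name());
}
```

Because the search pipeline only sees the trait, a new data source is one implementation away, and each source keeps its own indexing strategy behind it.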
Aho-Corasick automata with LeftmostLongest matching provide sub-millisecond text search across thousands of terms. Compiled to WebAssembly for browser deployment. Autocomplete in under 5ms.
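To show what LeftmostLongest semantics mean, here is a naive pure-std scanner that always prefers the longest pattern starting at each position. This is an illustration of the matching rule only, written for this document: the real system uses an Aho-Corasick automaton (as in the aho-corasick crate), which finds the same matches in a single linear-time pass.

```rust
/// Naive leftmost-longest multi-pattern scan over ASCII text.
/// Demonstrates the matching semantics only; an Aho-Corasick
/// automaton does this in one linear pass over the haystack.
fn leftmost_longest<'a>(haystack: &'a str, patterns: &[&str]) -> Vec<(usize, &'a str)> {
    let mut out = Vec::new();
    let mut i = 0;
    while i < haystack.len() {
        // Of all patterns starting at position i, keep the longest.
        let best = patterns
            .iter()
            .filter(|p| haystack[i..].starts_with(**p))
            .max_by_key(|p| p.len());
        if let Some(p) = best {
            out.push((i, &haystack[i..i + p.len()]));
            i += p.len(); // skip past the match: matches never overlap
        } else {
            i += 1;
        }
    }
    out
}

fn main() {
    // "knowledge graph" wins over the shorter "knowledge" at position 0.
    let hits = leftmost_longest(
        "knowledge graph search",
        &["knowledge", "knowledge graph", "graph"],
    );
    for (pos, m) in &hits {
        println!("{pos}: {m}");
    }
}
```

Leftmost-longest is what makes thesaurus matching deterministic: given overlapping terms like "knowledge" and "knowledge graph", the longer, more specific concept always wins.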
Choose your scorer: TitleScorer for quick matches, BM25Plus for statistical relevance, or TerraphimGraph for full semantic graph traversal. Each role can configure its own ranking strategy.
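Per-role scorer selection fits naturally into an enum dispatch. The enum variants mirror the scorers named above, but the scoring bodies here are deliberately crude stand-ins (term counting instead of real BM25+ or graph traversal), and the signatures are assumptions for this sketch.

```rust
/// Hypothetical per-role scorer selection; variant names mirror the
/// scorers described above, but these bodies are toy stand-ins.
enum Scorer {
    TitleScorer,
    Bm25Plus,
    TerraphimGraph,
}

struct Role {
    name: String,
    scorer: Scorer,
}

fn score(scorer: &Scorer, query: &str, title: &str, body: &str) -> f64 {
    match scorer {
        // Quick match: count query terms appearing in the title.
        Scorer::TitleScorer => query
            .split_whitespace()
            .filter(|t| title.contains(*t))
            .count() as f64,
        // Stand-in for BM25+: raw term frequency in the body.
        Scorer::Bm25Plus => query
            .split_whitespace()
            .map(|t| body.matches(t).count())
            .sum::<usize>() as f64,
        // Stand-in for graph traversal: a fixed boost here.
        Scorer::TerraphimGraph => 1.0,
    }
}

fn main() {
    let role = Role {
        name: "Engineer".into(),
        scorer: Scorer::TitleScorer,
    };
    let s = score(&role.scorer, "local search", "Terraphim local search", "body text");
    println!("{} scored {s}", role.name);
}
```

Because the scorer lives on the role, switching a role from quick title matching to full graph traversal is a configuration change, not a code change.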
Untrusted operations execute inside Firecracker microVMs with sub-2-second boot times. Full VM sandboxing for code execution, web requests, and file operations. Security without latency.
29 crates. Zero runtime garbage collection. Async throughout with Tokio. Memory safety guaranteed at compile time. The entire system is auditable, forkable, and yours.
We built it in Rust because your knowledge graph should never segfault.

Engineering Rationale
Deployment

Terraphim compiles to native binaries and WebAssembly, and ships as a desktop app via Tauri. The MCP server exposes every automata function as a tool for AI development environments. Local models via Ollama. Cloud models via OpenRouter. Your choice, always.