Terraphim AI runs entirely on your hardware. Semantic search across every knowledge source you use -- private, fast, and built in Rust.
Five relevance functions -- TitleScorer, BM25, BM25F, BM25Plus, and TerraphimGraph -- run concurrently. The best result surfaces regardless of the underlying algorithm.
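The competing-scorer idea can be sketched in plain Rust. The scorers below (`title_score`, `term_overlap_score`) are simplified stand-ins, not Terraphim's actual BM25 or graph implementations; each runs on its own thread and the highest score wins.

```rust
use std::thread;

// Illustrative scorers only -- the real relevance functions (BM25, BM25F,
// TerraphimGraph, ...) are far more involved.
fn title_score(query: &str, doc: &str) -> f64 {
    if doc.to_lowercase().contains(&query.to_lowercase()) { 1.0 } else { 0.0 }
}

fn term_overlap_score(query: &str, doc: &str) -> f64 {
    let terms: Vec<&str> = query.split_whitespace().collect();
    let hits = terms.iter().filter(|t| doc.contains(*t)).count();
    hits as f64 / terms.len().max(1) as f64
}

// Run every scorer concurrently and keep the best result, mirroring the
// "algorithms compete, best result surfaces" behaviour described above.
fn best_score(query: &str, doc: &str) -> f64 {
    let scorers: Vec<fn(&str, &str) -> f64> = vec![title_score, term_overlap_score];
    let handles: Vec<_> = scorers
        .into_iter()
        .map(|f| {
            let (q, d) = (query.to_string(), doc.to_string());
            thread::spawn(move || f(&q, &d))
        })
        .collect();
    handles
        .into_iter()
        .map(|h| h.join().unwrap())
        .fold(f64::MIN, f64::max)
}
```

Because each scorer is independent, adding a sixth relevance function is just one more entry in the list.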
Aho-Corasick automata match concepts using the LeftmostLongest strategy. Build thesauri from documents or URLs and construct rolegraphs per user profile automatically.
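To show what the LeftmostLongest strategy means in practice, here is a naive std-only sketch of the match-selection rule. A real Aho-Corasick automaton does this in a single linear-time pass; this version only illustrates which match gets picked when patterns overlap.

```rust
// Leftmost-longest selection: at the earliest matching position, prefer the
// longest pattern, then continue scanning after the match.
// Note: byte-index slicing assumes ASCII input in this sketch.
fn leftmost_longest<'a>(text: &'a str, patterns: &[&str]) -> Vec<&'a str> {
    let mut matches = Vec::new();
    let mut i = 0;
    while i < text.len() {
        // Among patterns starting at this position, take the longest.
        let best = patterns
            .iter()
            .filter(|p| text[i..].starts_with(*p))
            .max_by_key(|p| p.len());
        if let Some(p) = best {
            matches.push(&text[i..i + p.len()]);
            i += p.len(); // skip past the match, as LeftmostLongest does
        } else {
            i += 1;
        }
    }
    matches
}
```

Given the patterns `"knowledge"` and `"knowledge graph"`, the text `"a knowledge graph"` yields the single match `"knowledge graph"` -- the longer concept wins, which is exactly why thesaurus matching uses this strategy.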
Ripgrep, Confluence, Jira, Discourse, JMAP email, ClickUp, Logseq, Quickwit, and MCP -- all unified through a single middleware layer with consistent ranking.
Run untrusted code in fully sandboxed microVMs with sub-2-second boot times. The pooling system pre-warms instances, so allocation takes under 500 ms when you need one.
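The pooling pattern can be sketched as follows. `MicroVm` and `VmPool` are hypothetical stand-ins, not the actual Firecracker integration: the point is that boot cost is paid up front, so `allocate` is just a queue pop.

```rust
use std::collections::VecDeque;

// Hypothetical handle; the real integration manages Firecracker VM
// processes, kernels, and rootfs images.
struct MicroVm { id: u32 }

struct VmPool {
    warm: VecDeque<MicroVm>,
    next_id: u32,
}

impl VmPool {
    // Pre-boot `size` instances up front so later allocations are cheap.
    fn new(size: u32) -> Self {
        let mut pool = VmPool { warm: VecDeque::new(), next_id: 0 };
        for _ in 0..size {
            let vm = pool.cold_boot(); // slow path: seconds per VM
            pool.warm.push_back(vm);
        }
        pool
    }

    fn cold_boot(&mut self) -> MicroVm {
        let vm = MicroVm { id: self.next_id };
        self.next_id += 1;
        vm
    }

    // Fast path: hand out a pre-warmed VM; fall back to a cold boot
    // only if the pool is exhausted.
    fn allocate(&mut self) -> MicroVm {
        match self.warm.pop_front() {
            Some(vm) => vm,           // sub-500ms in the real system
            None => self.cold_boot(), // pool empty: pay the boot cost
        }
    }
}
```

A production pool would also refill itself in the background after each allocation, keeping the warm queue at its target size.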
A modular workspace of 29 specialised crates covering search, persistence, agent orchestration, knowledge graphs, and haystack integrations. Each crate does one thing well.
Integrate Ollama for local inference or OpenRouter for cloud models -- your choice, your data. AI summarisation, chat, and document processing without mandatory cloud APIs.
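The "your choice, your data" swap works because providers sit behind a common interface. The trait below is an illustrative sketch, not Terraphim's actual API: a local implementation would call Ollama over HTTP on localhost, a cloud one would call OpenRouter, and the rest of the app never hard-codes a vendor.

```rust
// Hypothetical abstraction -- trait and type names are illustrative.
trait LlmProvider {
    fn summarize(&self, text: &str) -> String;
}

// A local provider would POST to Ollama's HTTP API; a cloud provider
// would call OpenRouter. This stub just truncates so the sketch runs.
struct LocalStub;

impl LlmProvider for LocalStub {
    fn summarize(&self, text: &str) -> String {
        text.chars().take(20).collect()
    }
}

// Application code depends only on the trait, so swapping local
// inference for a cloud model is a one-line configuration change.
fn summarize_doc(provider: &dyn LlmProvider, doc: &str) -> String {
    provider.summarize(doc)
}
```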
Point Terraphim at your knowledge sources: local folders, Confluence wikis, email, or any supported connector. Everything stays local.
The automata engine extracts concepts, builds thesauri, and constructs role-specific graphs that understand your domain vocabulary.
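A minimal sketch of that pipeline, assuming a thesaurus that maps surface terms to canonical concepts (the actual rolegraphs carry richer, role-specific structure): concepts extracted from the same document become connected nodes.

```rust
use std::collections::{HashMap, HashSet};

// Build an undirected co-occurrence graph over canonical concepts.
// `thesaurus` maps surface terms to concept names; concepts that appear
// in the same document get an edge between them.
fn build_graph(
    docs: &[&str],
    thesaurus: &HashMap<&str, &str>,
) -> HashMap<String, HashSet<String>> {
    let mut graph: HashMap<String, HashSet<String>> = HashMap::new();
    for doc in docs {
        // Extract the canonical concepts mentioned in this document.
        let concepts: Vec<&str> = doc
            .split_whitespace()
            .filter_map(|w| thesaurus.get(w).copied())
            .collect();
        // Co-occurrence within one document becomes a graph edge.
        for a in &concepts {
            for b in &concepts {
                if a != b {
                    graph.entry(a.to_string()).or_default().insert(b.to_string());
                }
            }
        }
    }
    graph
}
```

Per-role graphs fall out naturally: build one graph per role, each from that role's own thesaurus.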
Query once, search everywhere. Five ranking algorithms compete to surface the most relevant results in under 200 milliseconds.
Core search, document management, and AI integration layer
Knowledge graph with nodes, edges, and role-based relationships
Aho-Corasick text matching, autocomplete, and thesaurus building
Haystack indexing, document processing, and search orchestration
Multi-backend storage with transparent cache warm-up
Firecracker microVM integration for secure execution
Role-based configuration with multi-backend support
Terminal UI with hierarchical REPL commands