Privacy-first AI with semantic search

Terraphim AI operates entirely on your device. Knowledge graphs, semantic embeddings, and local-first search: your data never leaves your machine.

29+ Rust crates
100% local processing
<2s VM boot time
0 data sent to the cloud

Everything runs locally

A modular Rust workspace designed from the ground up for privacy, performance, and extensibility.

Knowledge Graphs

Build structured relationships between concepts using custom graph-based semantic search with Aho-Corasick automata.
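The core idea is a thesaurus that maps surface terms to knowledge-graph nodes, with every term matched in one pass over the text. A minimal sketch of that lookup step, using a naive per-term scan in place of the real Aho-Corasick automaton so it stays dependency-free; the terms and node IDs are illustrative, not the actual Terraphim thesaurus:

```rust
use std::collections::HashMap;

/// Return (term, node id, byte offset) for every known term found in `text`.
/// The production code builds an Aho-Corasick automaton for a single-pass,
/// multi-pattern scan; `match_indices` stands in here for simplicity.
fn match_concepts<'a>(
    text: &str,
    thesaurus: &HashMap<&'a str, u64>, // term -> knowledge-graph node id
) -> Vec<(&'a str, u64, usize)> {
    let lower = text.to_lowercase();
    let mut hits = Vec::new();
    for (term, node_id) in thesaurus {
        for (pos, _) in lower.match_indices(term) {
            hits.push((*term, *node_id, pos));
        }
    }
    hits.sort_by_key(|&(_, _, pos)| pos);
    hits
}

fn main() {
    let thesaurus = HashMap::from([("knowledge graph", 1u64), ("rust", 2u64)]);
    for (term, node, pos) in match_concepts("A knowledge graph built in Rust.", &thesaurus) {
        println!("{term} -> node {node} at byte {pos}");
    }
}
```

Matched nodes are then ranked through the graph, so documents touching related concepts surface together.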

Privacy by Design

All processing happens on your device. No telemetry, no cloud dependency. Your knowledge stays yours.

Multi-Algorithm Search

Choose from TitleScorer, BM25, BM25F, BM25Plus, or TerraphimGraph relevance functions per role.
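For the BM25 family, the per-term score follows the classic Okapi formulation; the variants adjust it (BM25Plus adds a lower-bound delta, BM25F weights per-field term frequencies). A sketch with the usual defaults k1 = 1.2 and b = 0.75, illustrative rather than the exact scorer inside terraphim_service:

```rust
/// BM25 contribution of one term to one document's score.
fn bm25_term_score(
    tf: f64,      // term frequency in the document
    df: f64,      // number of documents containing the term
    n_docs: f64,  // total documents in the corpus
    doc_len: f64, // length of this document, in tokens
    avg_len: f64, // average document length across the corpus
) -> f64 {
    let k1 = 1.2;
    let b = 0.75;
    // Rarer terms get a higher inverse document frequency.
    let idf = ((n_docs - df + 0.5) / (df + 0.5) + 1.0).ln();
    // Term frequency saturates via k1; b normalizes for document length.
    idf * (tf * (k1 + 1.0)) / (tf + k1 * (1.0 - b + b * doc_len / avg_len))
}

fn main() {
    let score = bm25_term_score(3.0, 10.0, 1000.0, 100.0, 120.0);
    println!("BM25 term score: {score:.3}");
}
```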

Haystack Integrations

Connect to local files, Confluence, Jira, Discourse, email via JMAP, and more through a unified search interface.

Firecracker Sandboxing

Execute untrusted code in microVMs with sub-2-second boot times and full isolation.

MCP Integration

Expose autocomplete, text processing, and graph tools to any AI development environment via Model Context Protocol.
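Under MCP, a client invokes a tool with a JSON-RPC `tools/call` request. A sketch of what such a call might look like; the tool name and arguments are hypothetical, not Terraphim's actual tool schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "autocomplete_terms",
    "arguments": { "query": "knowl" }
  }
}
```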

Built with Rust, end to end

terraphim_service -- core orchestration
terraphim_middleware -- haystack indexing
terraphim_rolegraph -- knowledge graph
terraphim_automata -- text matching + WASM
terraphim_persistence -- multi-backend storage
terraphim_config -- role-based settings

29 crates, edition 2024, resolver v2
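As a rough picture of how such a workspace is wired together, assuming crate paths that match the names above (not the verbatim manifest):

```toml
[workspace]
resolver = "2"
members = [
    "crates/terraphim_service",
    "crates/terraphim_middleware",
    "crates/terraphim_rolegraph",
    "crates/terraphim_automata",
    "crates/terraphim_persistence",
    "crates/terraphim_config",
    # ...remaining workspace crates
]

[workspace.package]
edition = "2024"
```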

Modular by nature

Each concern lives in its own crate. Add a new search provider, relevance function, or haystack integration without touching the rest.

  • Async-first with Tokio runtime
  • WASM support for browser autocomplete
  • Multi-backend persistence layer
  • Role-based knowledge domains
  • Feature-gated LLM providers
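Feature-gating keeps optional LLM backends out of the default build entirely. A minimal sketch of the pattern, with a hypothetical `openai` Cargo feature and provider trait (not the actual Terraphim API):

```rust
// Common interface every LLM backend implements.
trait LlmProvider {
    fn name(&self) -> &'static str;
}

// Compiled in only when the (hypothetical) "openai" feature is enabled,
// so its dependencies never touch a default build.
#[cfg(feature = "openai")]
mod openai {
    pub struct OpenAiProvider;
    impl super::LlmProvider for OpenAiProvider {
        fn name(&self) -> &'static str { "openai" }
    }
}

// Fallback used when no LLM feature is enabled.
struct NoopProvider;
impl LlmProvider for NoopProvider {
    fn name(&self) -> &'static str { "disabled" }
}

fn active_provider() -> Box<dyn LlmProvider> {
    #[cfg(feature = "openai")]
    {
        return Box::new(openai::OpenAiProvider);
    }
    #[allow(unreachable_code)]
    Box::new(NoopProvider)
}

fn main() {
    println!("LLM provider: {}", active_provider().name());
}
```

Built without the feature, the optional module and its dependency tree are excluded at compile time rather than merely unused.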

Join the community

Terraphim AI is open source and built in the open. Contributions, issues, and ideas are always welcome.

Star on GitHub · Join Discord