TERRAPHIM

Privacy-First AI Assistant

Semantic search across your knowledge using local-first knowledge graphs. Your data stays on your hardware. Always.

Get Started Learn More

Equipment Rack

Each module in the Terraphim stack is purpose-built for privacy and performance.

Local-First Processing

Every query, every index operation, every AI interaction happens entirely on your machine. Zero cloud dependency.

Knowledge Graphs

Aho-Corasick automata build semantic connections between concepts, delivering context-aware search results.
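Terraphim's matcher is implemented in Rust; purely as an illustration of the technique named above, here is a minimal Aho-Corasick sketch in Python (a trie plus BFS failure links, so every pattern is found in a single pass over the text). All names here are ours, not Terraphim's API.

```python
from collections import deque

def build_automaton(patterns):
    """Build a minimal Aho-Corasick automaton: trie + BFS failure links."""
    trie = [{}]    # per-node map: character -> child node id
    out = [set()]  # patterns that end at each node
    fail = [0]     # failure link per node
    for pat in patterns:
        node = 0
        for ch in pat:
            if ch not in trie[node]:
                trie.append({}); out.append(set()); fail.append(0)
                trie[node][ch] = len(trie) - 1
            node = trie[node][ch]
        out[node].add(pat)
    # BFS from the root: compute failure links, merge output sets
    queue = deque(trie[0].values())
    while queue:
        node = queue.popleft()
        for ch, child in trie[node].items():
            f = fail[node]
            while f and ch not in trie[f]:
                f = fail[f]
            nxt = trie[f].get(ch, 0)
            fail[child] = nxt if nxt != child else 0
            out[child] |= out[fail[child]]
            queue.append(child)
    return trie, out, fail

def find_all(text, automaton):
    """Scan text once, yielding (end_index, pattern) for every match."""
    trie, out, fail = automaton
    node = 0
    for i, ch in enumerate(text):
        while node and ch not in trie[node]:
            node = fail[node]
        node = trie[node].get(ch, 0)
        for pat in out[node]:
            yield (i, pat)
```

Because failure links share suffixes between patterns, scanning stays linear in the text length even with thousands of concepts loaded, which is what makes automaton-based concept matching fast enough for interactive search.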

Rust-Powered Speed

Sub-millisecond text matching. Concurrent async pipelines. Memory-safe from the ground up with zero overhead.

Multi-Source Haystacks

Local files, Confluence, Jira, email via JMAP, Discourse, and Quickwit. Unified search, zero silos.

Role-Based Context

Switch between engineering, operations, and custom roles. Each role brings its own relevance tuning and graph.

Firecracker VMs

Execute untrusted code in isolated microVMs with sub-2-second boot times. Security without compromise.

Signal Path

From raw data to intelligent answers, follow the signal through the mixer.

Ch 1: Ingest
Connect your data sources: local files, APIs, or knowledge bases.
Ch 2: Index
Build automata and knowledge graphs from your content.
Ch 3: Search
BM25, TitleScorer, or TerraphimGraph: choose your relevance function.
Ch 4: Deliver
Results ranked by semantic meaning, served via API or desktop UI.
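TitleScorer and TerraphimGraph are Terraphim-specific; BM25, though, is the standard probabilistic ranking function, and a compact sketch shows what "choose your relevance function" means at the scoring step. This is a generic textbook BM25 in Python, not Terraphim's implementation; all names and the default k1/b values are ours.

```python
import math
from collections import Counter

def bm25_scores(query, docs, k1=1.5, b=0.75):
    """Score each tokenized document against a query with classic BM25.

    query: list of query terms; docs: list of token lists.
    Returns one score per document (higher = more relevant).
    """
    N = len(docs)
    avgdl = sum(len(d) for d in docs) / N
    # document frequency of each query term
    df = {t: sum(1 for d in docs if t in d) for t in set(query)}
    scores = []
    for doc in docs:
        tf = Counter(doc)
        score = 0.0
        for t in query:
            if df[t] == 0:
                continue  # term appears nowhere; contributes nothing
            idf = math.log(1 + (N - df[t] + 0.5) / (df[t] + 0.5))
            denom = tf[t] + k1 * (1 - b + b * len(doc) / avgdl)
            score += idf * tf[t] * (k1 + 1) / denom
        scores.append(score)
    return scores
```

The `b` parameter controls length normalization and `k1` caps term-frequency saturation; a graph-based scorer like TerraphimGraph would replace this term-statistics formula with ranking derived from concept connectivity.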

System Monitor

Quick start: from zero to search in four commands.

TERRAPHIM SYSTEM CONSOLE
// Clone and build from source
$ git clone https://github.com/terraphim/terraphim-ai
$ cd terraphim-ai && cargo build --release
 
// Start the server with your role config
$ cargo run --release -- --config terraphim_engineer_config.json
 
// Or launch the interactive agent
$ ./target/release/terraphim-agent

Press Play

Take control of your knowledge. Terraphim AI runs where you do.

View on GitHub Documentation