TERRAPHIM

Measuring the tremors in your knowledge landscape

Explore Features
View Source

Signal Detection

Every feature registers on the instrument, separating clear signals from the noise.

Privacy-First Architecture

All processing happens locally. Your data stays on your machine, never transmitted. Zero seismic footprint on external networks.

Knowledge Graphs

Semantic relationships built from documents using Aho-Corasick automata. Concepts connect to form a rich, navigable lattice of meaning.
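Terraphim uses Aho-Corasick automata to scan documents for concept terms in a single pass. As a rough illustration of the idea, here is a stdlib-only Rust sketch that counts concept occurrences naively (O(document × patterns) rather than the automaton's single pass); the concept names and sample text are invented for the example and not taken from the real thesaurus:

```rust
use std::collections::HashMap;

// Count how often each concept term appears in a document,
// case-insensitively. The real system compiles the terms into an
// Aho-Corasick automaton for one linear scan; this naive loop only
// demonstrates the matching idea.
fn match_concepts<'a>(doc: &str, concepts: &[&'a str]) -> HashMap<&'a str, usize> {
    let lower = doc.to_lowercase();
    concepts
        .iter()
        .filter_map(|&concept| {
            let hits = lower.matches(&concept.to_lowercase()).count();
            (hits > 0).then_some((concept, hits))
        })
        .collect()
}

fn main() {
    let doc = "The persistence layer feeds the knowledge graph; \
               the knowledge graph ranks search results.";
    let concepts = ["knowledge graph", "persistence layer", "thesaurus"];
    let hits = match_concepts(doc, &concepts);
    // "knowledge graph" occurs twice, "persistence layer" once,
    // "thesaurus" not at all (absent terms yield no graph edge).
    assert_eq!(hits["knowledge graph"], 2);
    assert_eq!(hits["persistence layer"], 1);
    assert!(!hits.contains_key("thesaurus"));
    println!("{hits:?}");
}
```

Documents that share matched concepts can then be linked through those concepts, which is what turns a flat corpus into a navigable graph.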

Multi-Source Search

Search across local files, Confluence, Jira, Discourse, email, and task managers. One query across every haystack, ranked by relevance.

Local LLM Integration

Run Ollama or other local models. Summarisation, chat, and intelligent descriptions without sending data off-device.

Firecracker Isolation

Sub-2-second microVM boot times. Execute untrusted code in full isolation. VM pooling for near-instant allocation.

MCP Protocol

Model Context Protocol server exposes autocomplete, text matching, thesaurus management, and graph connectivity as AI tools.
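MCP clients talk to such a server over JSON-RPC 2.0, invoking tools via the protocol's `tools/call` method. A sketch of what a request for the autocomplete tool might look like; the argument shape (`prefix`) is an assumption for illustration, not taken from Terraphim's actual schema:

```
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "autocomplete",
    "arguments": { "prefix": "seman" }
  }
}
```

The server replies with a result payload the AI client can feed back into its context, which is how graph connectivity and thesaurus lookups become usable as model tools.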

Live Activity

The knowledge graph is always recording. Every query, every connection, every insight.

Knowledge Graph Activity
29 Crates in workspace
7 Haystack integrations
5 Ranking algorithms
<2s VM boot time

Instrument Readings

Developer-ready. Read the trace directly.

# Build the instrument
cargo build --release

# Start recording
cargo run -- --config terraphim_engineer_config.json

# Open the terminal interface
./target/release/terraphim-agent

# Query the knowledge graph
/search "semantic search architecture"

# Chat with local LLM
/chat "explain the persistence layer"

Begin Recording

Your knowledge. Your instrument. Every signal captured, nothing lost.

View on GitHub
Documentation