Measuring the tremors in your knowledge landscape
Every feature registers on the instrument, separating clear signals from the noise.
All processing happens locally. Your data stays on your machine, never transmitted. Zero seismic footprint on external networks.
Semantic relationships built from documents using Aho-Corasick automata. Concepts connect to form a rich, navigable lattice of meaning.
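The lattice can be pictured as a two-step pass: match thesaurus terms in a document, then link the concepts those terms normalise to. The real engine compiles every term into an Aho-Corasick automaton for a single linear scan; this std-only sketch substitutes naive substring search to show the shape of the idea, and `extract_concepts` and the thesaurus layout are illustrative names, not the project's API.

```rust
use std::collections::{BTreeMap, BTreeSet};

/// Map each matched thesaurus term to the concept it normalises to.
/// (The real engine scans with an Aho-Corasick automaton; this sketch
/// uses naive substring search for clarity.)
fn extract_concepts(text: &str, thesaurus: &BTreeMap<&str, &str>) -> BTreeSet<String> {
    let lower = text.to_lowercase();
    thesaurus
        .iter()
        .filter(|(term, _)| lower.contains(&term.to_lowercase()))
        .map(|(_, concept)| concept.to_string())
        .collect()
}

fn main() {
    // Hypothetical thesaurus: surface term -> canonical concept.
    let mut thesaurus = BTreeMap::new();
    thesaurus.insert("knowledge graph", "graph");
    thesaurus.insert("semantic search", "search");
    thesaurus.insert("ollama", "llm");

    let doc = "Semantic search over the knowledge graph, summarised by Ollama.";
    // Concepts found in the same document become linked nodes in the lattice.
    let concepts = extract_concepts(doc, &thesaurus);
    println!("{:?}", concepts); // {"graph", "llm", "search"}
}
```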
Search across local files, Confluence, Jira, Discourse, email, and task managers. One query across every haystack, ranked by relevance.
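A minimal sketch of that fan-out-and-rank idea, assuming a `Haystack` trait and `Hit` type invented here for illustration (the project's actual connector API is not shown): every source answers the same query, and the merged results are sorted by score.

```rust
#[derive(Debug)]
struct Hit {
    source: &'static str,
    title: String,
    score: f64,
}

/// One interface per source: local files, Confluence, Jira, email, ...
trait Haystack {
    fn search(&self, query: &str) -> Vec<Hit>;
}

/// Fan one query out to every haystack, then rank across sources.
fn federated_search(query: &str, haystacks: &[Box<dyn Haystack>]) -> Vec<Hit> {
    let mut hits: Vec<Hit> = haystacks.iter().flat_map(|h| h.search(query)).collect();
    hits.sort_by(|a, b| b.score.partial_cmp(&a.score).unwrap());
    hits
}

// Stub source standing in for a real connector.
struct LocalFiles;
impl Haystack for LocalFiles {
    fn search(&self, query: &str) -> Vec<Hit> {
        vec![Hit { source: "files", title: format!("note about {query}"), score: 0.9 }]
    }
}

fn main() {
    let stacks: Vec<Box<dyn Haystack>> = vec![Box::new(LocalFiles)];
    for hit in federated_search("persistence", &stacks) {
        println!("[{}] {} ({:.2})", hit.source, hit.title, hit.score);
    }
}
```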
Run Ollama or other local models. Summarisation, chat, and intelligent descriptions without sending data off-device.
Sub-2-second microVM boot times. Execute untrusted code in full isolation. VM pooling for near-instant allocation.
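The pooling step can be sketched as a warm queue: pre-booted VMs are popped on demand, and only an empty pool pays the cold-boot cost. `MicroVm` and `VmPool` are hypothetical names for this sketch, not the project's API.

```rust
use std::collections::VecDeque;

#[derive(Debug, PartialEq)]
struct MicroVm {
    id: u32,
}

/// A warm pool of pre-booted microVMs: allocation becomes a queue pop
/// instead of a cold boot.
struct VmPool {
    warm: VecDeque<MicroVm>,
    next_id: u32,
}

impl VmPool {
    /// Pre-boot `n` VMs up front so later requests are near-instant.
    fn with_capacity(n: u32) -> Self {
        let warm = (0..n).map(|id| MicroVm { id }).collect();
        VmPool { warm, next_id: n }
    }

    /// Near-instant path: pop a warm VM. Fall back to a cold boot
    /// (the sub-2-second path in practice) only when the pool is empty.
    fn acquire(&mut self) -> MicroVm {
        if let Some(vm) = self.warm.pop_front() {
            vm
        } else {
            let vm = MicroVm { id: self.next_id }; // cold-boot fallback
            self.next_id += 1;
            vm
        }
    }

    /// Return the VM to the pool for the next piece of untrusted code.
    fn release(&mut self, vm: MicroVm) {
        self.warm.push_back(vm);
    }
}

fn main() {
    let mut pool = VmPool::with_capacity(2);
    let vm = pool.acquire(); // instant: came from the warm pool
    println!("running untrusted code in vm {}", vm.id);
    pool.release(vm);
}
```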
Model Context Protocol server exposes autocomplete, text matching, thesaurus management, and graph connectivity as AI tools.
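At its simplest, exposing tools means keeping a registry of named handlers that a client can list and invoke by name. The tool names below mirror the ones above; the dispatch shape is a sketch of the idea, not the MCP wire protocol.

```rust
use std::collections::BTreeMap;

/// A tool is just a named handler the AI client can call.
type Tool = fn(&str) -> String;

// Stub handlers standing in for the real tool implementations.
fn autocomplete(input: &str) -> String {
    format!("completions for '{input}'")
}

fn thesaurus_lookup(input: &str) -> String {
    format!("synonyms of '{input}'")
}

fn main() {
    let mut tools: BTreeMap<&str, Tool> = BTreeMap::new();
    tools.insert("autocomplete", autocomplete);
    tools.insert("thesaurus", thesaurus_lookup);

    // A client first discovers the exposed tools...
    for name in tools.keys() {
        println!("tool: {name}");
    }
    // ...then invokes one by name.
    let reply = tools["autocomplete"]("semantic se");
    println!("{reply}");
}
```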
The knowledge graph is always recording. Every query, every connection, every insight.
Developer-ready. Read the trace directly.
```shell
# Build the instrument
cargo build --release

# Start recording
cargo run -- --config terraphim_engineer_config.json

# Open the terminal interface
./target/release/terraphim-agent

# Query the knowledge graph
/search "semantic search architecture"

# Chat with local LLM
/chat "explain the persistence layer"
```
Your knowledge. Your instrument. Every signal captured, nothing lost.