TERRAPHIM AI

Privacy-first semantic search. Knowledge graphs on your device. Nothing leaves your machine.

Observations

Knowledge Graphs

Aho-Corasick automata match thesaurus terms in your documents to build semantic graphs, enabling concept-level search across all your knowledge.

Privacy-First

Everything runs locally. No cloud dependency, no data exfiltration. Your knowledge stays on your device.

Multi-Source Search

Confluence, Jira, Discourse, email, local files, Quickwit -- one query, all your sources, ranked by relevance.

Role Profiles

Each role has its own knowledge graph, haystacks, and relevance function. Switch contexts instantly.
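A role profile can be pictured as a small bundle of settings; the field names here are illustrative, not the actual terraphim_config schema:

```rust
use std::collections::HashMap;

/// Illustrative role profile. Field names are hypothetical sketches
/// of what a role carries, not the real terraphim_config types.
#[derive(Debug, Clone)]
struct Role {
    thesaurus_url: String,      // source of the role's knowledge graph
    haystacks: Vec<String>,     // e.g. local paths, remote spaces
    relevance_function: String, // e.g. a graph-based or title-based scorer
}

fn main() {
    let mut roles: HashMap<&str, Role> = HashMap::new();
    roles.insert("Engineer", Role {
        thesaurus_url: "file://./docs/engineering_thesaurus.json".into(),
        haystacks: vec!["./docs/src".into()],
        relevance_function: "terraphim-graph".into(),
    });
    roles.insert("Researcher", Role {
        thesaurus_url: "file://./docs/research_thesaurus.json".into(),
        haystacks: vec!["./papers".into(), "./notes".into()],
        relevance_function: "title-scorer".into(),
    });

    // Switching contexts is just selecting a different profile.
    let active = &roles["Researcher"];
    println!("searching {} haystacks", active.haystacks.len());
}
```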

MCP Integration

Model Context Protocol server exposes autocomplete, text matching, and graph connectivity as AI tools.

Firecracker VMs

Secure command execution in microVMs. Sub-2-second boot. Sandbox untrusted code with minimal overhead.

Desktop Application

Native app built with Svelte, TypeScript, and Tauri. Real-time search, graph visualisation, and config management.

LLM Support

OpenRouter and Ollama integration for summarisation, chat completion, and intelligent descriptions.

29 Crates
6 Haystacks
5 Relevance Fns
<2s VM Boot
200KB WASM

System Map

  • terraphim_service Main service layer -- search, document management, AI integration
  • terraphim_middleware Haystack indexing, document processing, search orchestration
  • terraphim_rolegraph Knowledge graph with node/edge relationships and thesaurus
  • terraphim_automata Text matching, autocomplete, thesaurus building, WASM support
  • terraphim_persistence Multi-backend storage with cache warm-up (memory, dashmap, SQLite, S3)
  • terraphim_mcp_server MCP server for AI tool integration (stdio, SSE, HTTP transports)
  • terraphim_tui Terminal UI with interactive REPL, search, chat, and VM management
  • haystack_* Atlassian, Discourse, JMAP, and core haystack abstractions
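The rolegraph idea — nodes, edges, and a connectivity check like `is_all_terms_connected_by_path` — can be sketched in a few lines of plain Rust. This is a minimal illustration, not the terraphim_rolegraph API:

```rust
use std::collections::{HashMap, HashSet, VecDeque};

/// Are all query terms connected by some path in the graph?
/// Minimal BFS sketch; not the terraphim_rolegraph implementation.
fn all_connected(edges: &[(&str, &str)], terms: &[&str]) -> bool {
    // Build an undirected adjacency list from the edge pairs.
    let mut adj: HashMap<&str, Vec<&str>> = HashMap::new();
    for &(a, b) in edges {
        adj.entry(a).or_default().push(b);
        adj.entry(b).or_default().push(a);
    }
    let Some(&start) = terms.first() else { return true };
    // BFS from the first term; every other term must be reachable.
    let mut seen: HashSet<&str> = HashSet::from([start]);
    let mut queue = VecDeque::from([start]);
    while let Some(node) = queue.pop_front() {
        for &next in adj.get(node).into_iter().flatten() {
            if seen.insert(next) {
                queue.push_back(next);
            }
        }
    }
    terms.iter().all(|t| seen.contains(t))
}

fn main() {
    let edges = [("search", "index"), ("index", "haystack"), ("graph", "node")];
    println!("{}", all_connected(&edges, &["search", "haystack"])); // true
    println!("{}", all_connected(&edges, &["search", "graph"]));    // false
}
```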

Mission Control

Observatory Terminal
# Clone and build
git clone https://github.com/terraphim/terraphim-ai
cd terraphim-ai
cargo build --release

# Start the server
cargo run --release -- --config terraphim_engineer_config.json

# Start the desktop frontend
cd desktop && yarn install && yarn dev

# Or launch the TUI
cargo build -p terraphim_tui --features repl-full --release
./target/release/terraphim-agent

MCP Server
# Start MCP for AI tool integration
cd crates/terraphim_mcp_server
./start_local_dev.sh

# Tools: autocomplete_terms, find_matches,
# replace_matches, load_thesaurus,
# is_all_terms_connected_by_path

Begin Observation

Point your telescope at the source. Privacy-first, local-first, yours.