From Zero to Persistent Memory
OMEGA is an MCP server, not a background AI reading your conversations. Your client calls tools to store and retrieve memories. Here's how it works and how to set it up.
How It Works
One process, four layers: MCP tools, session hooks, SQLite storage, and local embeddings. Everything runs on your machine.
OMEGA doesn't read your conversations. It's a tool provider. Your AI client decides when to call omega_store() or omega_query(), guided by protocol instructions loaded at session start.
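Concretely, every interaction is an ordinary MCP tool call. Here's a minimal sketch using the official MCP TypeScript SDK; the launch command and the argument fields (content, type, query) are assumptions for illustration, not OMEGA's documented schema.

```typescript
// Minimal sketch: an MCP client invoking OMEGA's tools over stdio.
// The launch command and argument fields are illustrative assumptions.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const client = new Client({ name: "example-client", version: "1.0.0" }, { capabilities: {} });
await client.connect(new StdioClientTransport({ command: "omega" })); // hypothetical command

// The AI decides a decision is worth keeping...
await client.callTool({
  name: "omega_store",
  arguments: { content: "Chose SQLite for zero-config local storage.", type: "decision" },
});

// ...and later pulls prior context back out.
const result = await client.callTool({
  name: "omega_query",
  arguments: { query: "why did we pick SQLite?" },
});
console.log(result.content);
```

In a normal session you never write this code yourself: your AI client issues these calls on your behalf.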
Claude Code's hook system fires shell commands on events like session start and end. OMEGA hooks inject context (prior decisions, handoff summaries) so the AI has relevant history without scanning anything.
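Claude Code hooks are configured in settings.json (for example, .claude/settings.json in a project). The snippet below follows Claude Code's hook format as a sketch; the command strings are placeholders for whatever OMEGA's installer actually registers.

```json
{
  "hooks": {
    "SessionStart": [
      { "hooks": [{ "type": "command", "command": "omega hook session-start" }] }
    ],
    "SessionEnd": [
      { "hooks": [{ "type": "command", "command": "omega hook session-end" }] }
    ]
  }
}
```

The event names are Claude Code's; the omega hook commands are stand-ins, since JSON can't carry comments to say so inline.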
All memories live in a SQLite database on your machine. Vector embeddings (bge-small-en-v1.5) run locally via ONNX. No cloud, no API keys, no data leaves your device.
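The retrieval side is a standard embed-and-rank pattern. A minimal sketch, assuming transformers.js as the local ONNX runtime and an in-memory stand-in for the SQLite rows (OMEGA's actual pipeline may differ):

```typescript
// General pattern only: OMEGA's internals aren't reproduced here.
// Assumes transformers.js, which runs ONNX models fully on-device.
import { pipeline } from "@xenova/transformers";

const embed = await pipeline("feature-extraction", "Xenova/bge-small-en-v1.5");

async function toVector(text: string): Promise<number[]> {
  const out = await embed(text, { pooling: "mean", normalize: true });
  return Array.from(out.data as Float32Array); // 384-dim unit vector
}

// With normalized vectors, cosine similarity reduces to a dot product.
const dot = (a: number[], b: number[]) => a.reduce((s, x, i) => s + x * b[i], 0);

// Stand-in for rows loaded from the local SQLite database.
const memories = ["We chose SQLite over Postgres.", "Deploys happen on Fridays."];
const vectors = await Promise.all(memories.map(toVector));

const q = await toVector("which database did we pick?");
const ranked = memories
  .map((text, i) => ({ text, score: dot(vectors[i], q) }))
  .sort((a, b) => b.score - a.score);
console.log(ranked[0].text); // most semantically similar memory
```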
The Auto-Capture Loop
The “auto” in auto-capture comes from two things: hooks that fire on session events, and protocol instructions that teach the AI when and what to persist. The LLM is the intelligence layer. OMEGA is the persistence and retrieval layer.
1. Session start. A hook fires omega_welcome(), which returns recent memories, pending tasks, handoff summaries, and your user profile.
2. Protocol load. omega_protocol() returns operating instructions that tell the AI when to store memories, what types to use, and how to query for prior context.
3. During the session. The AI calls omega_store() for decisions, lessons, and error patterns, and omega_query() when it needs prior context. You can also say "remember this" to store anything explicitly.
4. Session end. A hook saves a session summary. The next session picks up where this one left off, with full context of what happened before (one pass through the loop is sketched below).
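Here is that pass in the same illustrative terms as the earlier snippet: the tool names are OMEGA's, the argument and result shapes are assumed.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";

// One pass through the auto-capture loop. Tool names are OMEGA's;
// argument and result shapes are illustrative assumptions.
async function runSession(client: Client) {
  // 1. Session start: hook-triggered context injection.
  const welcome = await client.callTool({ name: "omega_welcome", arguments: {} });

  // 2. Protocol load: operating instructions for the model.
  const protocol = await client.callTool({ name: "omega_protocol", arguments: {} });

  // 3. During the session: persist decisions, retrieve prior context.
  await client.callTool({
    name: "omega_store",
    arguments: { content: "Rolled back the cache change; it broke cold starts.", type: "lesson" },
  });
  const context = await client.callTool({
    name: "omega_query",
    arguments: { query: "previous caching decisions" },
  });

  // 4. Session end: a hook saves the handoff summary (a shell command,
  //    not a model decision), so the next session starts with this context.
  return { welcome, protocol, context };
}
```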
Get Started
Three commands. Under two minutes. Works with Claude Code out of the box.
Using Cursor, Windsurf, or Zed? Pass --client cursor, --client windsurf, or --client zed. See the full docs for details.
Try It Out
Open Claude Code in any project. OMEGA loads automatically.
Ready to remember?
12 MCP tools. Semantic search. Intelligent forgetting. All local.