
From Zero to Persistent Memory

OMEGA is an MCP server, not a background AI reading your conversations. Your client calls tools to store and retrieve memories. Here's how it works and how to set it up.

How It Works

One process, four layers. Everything runs locally on your machine.

1. Your AI Client (Claude Code, Cursor, Windsurf)
2. MCP Protocol (stdio / JSON-RPC)
3. OMEGA Server (Python process)
4. Local Storage (SQLite + vector embeddings)
No Background AI

OMEGA doesn't read your conversations. It's a tool provider. Your AI client decides when to call omega_store() or omega_query(), guided by protocol instructions loaded at session start.
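Concretely, "calls tools" means the client sends a JSON-RPC 2.0 `tools/call` request over stdio. A minimal sketch in Python, using the tool names from this page; the `content` argument is an illustrative field name, not a confirmed schema:

```python
import json

def make_tool_call(tool: str, arguments: dict, msg_id: int) -> str:
    """Build an MCP tools/call request as one JSON-RPC 2.0 message.

    MCP clients speaking over stdio send one JSON object per message.
    """
    return json.dumps({
        "jsonrpc": "2.0",
        "id": msg_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# What the client might send when it decides to persist a decision:
request = make_tool_call(
    "omega_store",
    {"content": "Use early returns; max 2 nesting levels"},
    msg_id=1,
)
print(request)
```

The key point is direction of control: the client constructs and sends this request; the server only responds.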

Hooks, Not Surveillance

Claude Code's hook system fires shell commands on events like session start and end. OMEGA hooks inject context (prior decisions, handoff summaries) so the AI has relevant history without scanning anything.
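Conceptually, a hooks entry maps a session event to a shell command. A sketch of what such a configuration could look like, expressed as a Python dict; the `omega hook ...` commands are hypothetical placeholders for whatever `omega setup` actually installs, and the exact schema may differ from Claude Code's:

```python
import json

# Illustrative shape of a hooks configuration: each session event
# triggers a shell command that injects or saves context.
hooks_config = {
    "hooks": {
        "SessionStart": [
            {"hooks": [{"type": "command", "command": "omega hook session-start"}]}
        ],
        "SessionEnd": [
            {"hooks": [{"type": "command", "command": "omega hook session-end"}]}
        ],
    }
}

print(json.dumps(hooks_config, indent=2))
```

Because hooks are plain shell commands fired on events, nothing watches the conversation itself; context flows in only at the moments you configure.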

Local-First Storage

All memories live in a SQLite database on your machine. Vector embeddings (bge-small-en-v1.5) run locally via ONNX. No cloud, no API keys, no data leaves your device.
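The retrieval side can be pictured as embeddings stored next to the memory text and ranked by cosine similarity. A brute-force sketch under assumed structure: a hypothetical one-table schema and tiny hand-written vectors standing in for real bge-small-en-v1.5 output:

```python
import math
import sqlite3

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE memories (id INTEGER PRIMARY KEY, text TEXT, vec TEXT)")

def store(text, vec):
    # Store the embedding as comma-separated text for simplicity.
    db.execute("INSERT INTO memories (text, vec) VALUES (?, ?)",
               (text, ",".join(map(str, vec))))

def query(qvec, k=1):
    # Brute-force scan: score every row, return the top-k texts.
    rows = db.execute("SELECT text, vec FROM memories").fetchall()
    scored = [(cosine(qvec, [float(x) for x in v.split(",")]), t) for t, v in rows]
    return [t for _, t in sorted(scored, reverse=True)[:k]]

store("prefer early returns", [0.9, 0.1, 0.0])
store("deploying on Fridays is risky", [0.1, 0.9, 0.2])
print(query([0.8, 0.2, 0.1]))  # most similar memory first
```

Everything here runs in-process against a local file (or `:memory:`), which is the property the section is claiming: no network hop anywhere in the query path.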

The Auto-Capture Loop

The “auto” in auto-capture comes from two things: hooks that fire on session events, and protocol instructions that teach the AI when and what to persist. The LLM is the intelligence layer. OMEGA is the persistence and retrieval layer.

1. Session Starts

A hook fires omega_welcome(), which returns recent memories, pending tasks, handoff summaries, and your user profile.

2. Protocol Loads

omega_protocol() returns operating instructions that tell the AI when to store memories, what types to use, and how to query for prior context.

3. During Work

The AI calls omega_store() for decisions, lessons, and error patterns. It calls omega_query() when it needs prior context. You can also say "remember this" to store anything explicitly.

4. Session Ends

A hook saves a session summary. The next session picks up where this one left off, with full context of what happened before.
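The four steps above can be sketched from the client's perspective. Tool names come from this page; the dispatch logic, memory kinds, and argument shapes are illustrative:

```python
# Sketch of the auto-capture loop. `call_tool` stands in for the
# client's MCP transport (see the JSON-RPC layer above it).
def run_session(call_tool, work_items):
    briefing = call_tool("omega_welcome", {})   # 1. hook fires at session start
    protocol = call_tool("omega_protocol", {})  # 2. operating instructions load
    for item in work_items:                     # 3. during work
        if item["kind"] in ("decision", "lesson", "error_pattern"):
            call_tool("omega_store", {"content": item["text"]})
        else:
            call_tool("omega_query", {"query": item["text"]})
    call_tool("omega_store", {"content": "session summary"})  # 4. end-of-session hook
    return briefing, protocol

# Fake transport that just records which tools get called:
log = []
def fake(name, args):
    log.append(name)
    return {}

run_session(fake, [{"kind": "decision", "text": "use SQLite for storage"}])
print(log)
```

Note where the intelligence sits: the `if` branch is the decision the LLM makes from the protocol instructions, while OMEGA only executes the store and query calls it receives.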

Get Started

Three commands. Under two minutes. Works with Claude Code out of the box.

1. Install
$ pip install omega-memory
Installed omega-memory 0.9.3
2. Configure
$ omega setup
# Registers MCP server with Claude Code
# Installs session hooks for auto-capture
# Creates bootstrap CLAUDE.md instructions
MCP server registered
Hooks installed
CLAUDE.md bootstrap written

Using Cursor, Windsurf, or Zed? Pass --client cursor, --client windsurf, or --client zed. See the full docs for details.

3. Verify
$ omega doctor
Python 3.11+ detected
MCP server registered in claude_desktop_config.json
Hooks installed and executable
SQLite database initialized
Embedding model loaded (bge-small-en-v1.5)
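The checks above amount to a series of environment probes. An illustrative, much-simplified version of that idea, not the actual `omega doctor` implementation:

```python
import shutil
import sys

def doctor() -> bool:
    """Run a few environment probes and report each result."""
    checks = {
        "Python 3.11+": sys.version_info >= (3, 11),
        "sqlite3 module available": True,  # part of the standard library
        "claude CLI on PATH": shutil.which("claude") is not None,
    }
    for name, ok in checks.items():
        print(("OK   " if ok else "FAIL ") + name)
    return all(checks.values())

healthy = doctor()
```

Each probe is independent, so a failure pinpoints exactly which layer of the setup is broken.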

Try It Out

Open Claude Code in any project. OMEGA loads automatically.

your first memory
# Start a new Claude Code session
$ claude
[WELCOME] Session briefing ready
Profile: loaded
Recent: 0 memories (fresh install)
# Ask Claude to remember a preference:
> "Always use early returns in this project, never nest more than 2 levels"
Stored as user_preference
# Close the session and start a new one...
[WELCOME] Session briefing ready
Recent: 1 preference
Preferences: early returns, max 2 nesting levels

Ready to remember?

12 MCP tools. Semantic search. Intelligent forgetting. All local.