Self-Host Mem0: Open-Source Persistent AI Memory Layer


What is Mem0?

Mem0 is an open-source memory layer that gives AI assistants persistent, contextual memory. It uses LLMs to extract and manage key facts from conversations and a vector store (plus optional graph store) to retrieve relevant memories later. Mem0 can be used as a standalone REST API server or via the OpenMemory MCP Server to share memory across MCP-compatible tools like Cursor and Claude Desktop.

Key Features

  • Persistent contextual memory: Store and recall user/agent facts across sessions
  • Simple APIs: Add, search, list, update, delete, and reset memories
  • Scoped memory: Organize by user_id, agent_id, and run_id (session)
  • Dual storage architecture: Vector DB for semantic retrieval; optional graph DB for relationships
  • REST API server (FastAPI): Built-in OpenAPI docs at /docs
  • MCP-compatible (OpenMemory): Local-first MCP server + UI for cross-tool memory
  • Flexible backends: Works with popular LLMs and embedding providers

Option 1: REST API Server (Docker Compose)

This starts the REST API alongside Postgres (with pgvector), Neo4j (for graph memory), and a history database.

# Clone the repo
git clone https://github.com/mem0ai/mem0
cd mem0/server

# Create environment
cp .env.example .env  # if present
# Or create .env with at least:
# OPENAI_API_KEY=your-openai-api-key

# Start services
docker compose up -d

Once the services are running:

  • API base URL: http://localhost:8888
  • OpenAPI docs: http://localhost:8888/docs

Only OPENAI_API_KEY is required to get started. The compose file includes volumes for persistence during development.
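
Before calling the API, sanity-check the stack. The checks below use only standard Docker and curl; the exact service names come from the repo's compose file, so trust whatever docker compose ps reports.

# Verify the containers came up
docker compose ps

# The API is ready when the docs page returns HTTP 200
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8888/docs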

Option 2: OpenMemory MCP Server (Local, with UI)

OpenMemory creates a shared, persistent memory layer for MCP-compatible tools (Cursor, Claude Desktop, Windsurf, etc.). It’s powered by Mem0 and runs fully on your machine.

# Quick start (requires Docker)
# Set your OpenAI key either globally or inline
export OPENAI_API_KEY=your_api_key

curl -sL https://raw.githubusercontent.com/mem0ai/mem0/main/openmemory/run.sh | bash
# Or in one line:
# curl -sL https://raw.githubusercontent.com/mem0ai/mem0/main/openmemory/run.sh | OPENAI_API_KEY=your_api_key bash

This launches the OpenMemory server and UI locally. The container stores data locally; for long-term persistence, follow the OpenMemory docs to configure a persistent data path.
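
Once the script finishes, you can verify the deployment. The ports below (3000 for the dashboard, 8765 for the API/MCP endpoint) are OpenMemory's documented defaults; if yours differ, check the run script's output.

# Confirm the OpenMemory containers are running
docker ps --filter "name=openmemory"

# Dashboard UI (default port; open in a browser)
# http://localhost:3000

# MCP clients connect over SSE using a URL of this documented pattern:
# http://localhost:8765/mcp/<client-name>/sse/<user-id>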

Quick API Examples (REST Server)

Add memory

curl -X POST http://localhost:8888/add \
  -H "Content-Type: application/json" \
  -d '{
    "data": "Alice prefers Python for web development.",
    "user_id": "user_alice",
    "metadata": {"topic": "preferences", "language": "Python"}
  }'
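
Add scoped memory (agent/session)

The agent_id and run_id fields below mirror the scoping parameters of the core Mem0 library (see Key Features); whether this server build accepts them on /add is an assumption, so confirm against the /docs page.

curl -X POST http://localhost:8888/add \
  -H "Content-Type: application/json" \
  -d '{
    "data": "Alice asked for a Django project scaffold.",
    "user_id": "user_alice",
    "agent_id": "coding_assistant",
    "run_id": "session_001"
  }'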

Search memories

curl -X POST http://localhost:8888/search \
  -H "Content-Type: application/json" \
  -d '{
    "query": "What are Alice\'s language preferences?",
    "user_id": "user_alice",
    "limit": 3
  }'

List all (filtered)

curl "http://localhost:8888/get_all?user_id=user_alice&limit=5"

Update memory

curl -X PUT "http://localhost:8888/update?memory_id=YOUR_MEMORY_ID" \
  -H "Content-Type: application/json" \
  -d '{"data": "Alice also enjoys Go for systems programming."}'

Delete memory

curl -X DELETE "http://localhost:8888/delete?memory_id=YOUR_MEMORY_ID"

Delete all (by user)

curl -X DELETE "http://localhost:8888/delete_all?user_id=user_alice"

Supported Workflows

  • Customer support & CRM: Remember preferences, history, and outcomes across sessions (see the sketch after this list)
  • Personal assistants & tutors: Adapt to user goals, skills, and long-term context
  • MCP toolchains: Share memory across Cursor, Claude Desktop, and other MCP clients
  • Enterprise knowledge: Build institutional memory with structured retrieval
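
As a concrete illustration of the support workflow above, here is a sketch chaining the endpoints from the previous section: record an outcome when a ticket closes, then recall it when the customer returns. The ticket number and memory text are hypothetical.

# Ticket closes: store the outcome for next time
curl -X POST http://localhost:8888/add \
  -H "Content-Type: application/json" \
  -d '{"data": "Resolved billing issue by moving Alice to annual invoicing.", "user_id": "user_alice", "metadata": {"ticket": "T-1042"}}'

# Customer returns: pull relevant history before replying
curl -X POST http://localhost:8888/search \
  -H "Content-Type: application/json" \
  -d '{"query": "billing history and past resolutions", "user_id": "user_alice", "limit": 3}'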

Why Self-Host Mem0?

  • Data control: Keep all memory local to your infrastructure
  • Privacy: No third-party cloud storage for sensitive context
  • Flexibility: Choose LLMs, embedders, and vector stores that fit your stack
  • Cost: Avoid per-seat or per-request platform pricing by running it yourself