Turn your data into AI memory

Take your data downloads from ChatGPT, Instagram, and more — turn them into searchable memories your AI tools can actually use. Runs locally, nothing leaves your machine.

2 min quickstart · MCP ready · Runs locally

git clone https://github.com/onfabric/context-use.git
cd context-use
uv sync && source .venv/bin/activate
context-use config set-key
context-use quickstart

Setup

1. Install

git clone https://github.com/onfabric/context-use.git
cd context-use
uv sync
source .venv/bin/activate

Requires Python 3.12+ and uv

2. Set your OpenAI key

context-use config set-key
# or: export OPENAI_API_KEY=sk-...

3. Run the quickstart

context-use quickstart

Writes memories to data/output/ — no database needed.
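The quickstart's output is just files on disk, so you can inspect it with any script. A minimal sketch of loading the results, assuming each memory is written as its own JSON file under data/output/ (the exact schema is the tool's own and not shown here):

```python
import json
from pathlib import Path

def load_memories(output_dir: str) -> list[dict]:
    """Load every *.json file from the quickstart output directory.

    Assumes one JSON object per file; adjust for the actual
    layout context-use writes on your machine.
    """
    memories = []
    for path in sorted(Path(output_dir).glob("*.json")):
        with path.open() as f:
            memories.append(json.load(f))
    return memories
```

Because the output is plain files, it stays portable: you can version it, sync it, or feed it to other tools without a database.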

4. Full pipeline

# Set up PostgreSQL (one-time)
context-use config set-store postgres

# Run the pipeline
context-use pipeline

# Search your memories
context-use memories search "hiking trips in 2024"

Persistent storage with semantic search via PostgreSQL.
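Semantic search works by ranking stored memories by embedding similarity rather than keyword match. A language-level sketch of the idea (cosine similarity over stored vectors), not the tool's actual query path:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def search(query_vec: list[float], memories: list[tuple[list[float], str]]):
    """Rank (vector, text) memories by similarity to the query vector."""
    return sorted(memories, key=lambda m: cosine(query_vec, m[0]), reverse=True)
```

In practice the vector math happens inside PostgreSQL, which is why a query like "hiking trips in 2024" can match memories that never contain those exact words.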

5. MCP server

python -m context_use.ext.mcp_use.run

Add to your Claude Desktop or Cursor config:

mcp-config.json
{
  "mcpServers": {
    "context-use": {
      "command": "python",
      "args": ["-m", "context_use.ext.mcp_use.run", "--transport", "stdio"]
    }
  }
}
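A malformed config is a common reason an MCP server silently fails to appear in the client. A small sanity check for the snippet above (key names taken from that snippet; this is a generic JSON check, not part of context-use):

```python
import json

def check_mcp_config(raw: str) -> bool:
    """Return True if the JSON declares a context-use MCP server entry
    with both a command and its args."""
    cfg = json.loads(raw)
    server = cfg.get("mcpServers", {}).get("context-use")
    return bool(server and server.get("command") and server.get("args"))
```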

6. Personal agent

context-use config set-agent adk
context-use agent synthesise
context-use agent profile
context-use agent ask "What topics do I keep coming back to?"
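The agent commands are plain CLI calls, so they compose into scripts. A hedged wrapper that shells out to the tool (assumes context-use is on your PATH and an agent is configured; the argv builder is split out so it is easy to test):

```python
import subprocess

def agent_ask_argv(question: str) -> list[str]:
    """Build the argv for `context-use agent ask` (pure helper)."""
    return ["context-use", "agent", "ask", question]

def agent_ask(question: str) -> str:
    """Run the CLI and return its stdout; raises if the command fails."""
    result = subprocess.run(
        agent_ask_argv(question), capture_output=True, text=True, check=True
    )
    return result.stdout.strip()
```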

Data sources

- ChatGPT: Conversations
- Instagram: Stories, Reels, Posts
- WhatsApp: Messages (soon)
- Google Takeout: Search, YouTube, Maps (soon)

New integrations can be contributed by the community.
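At its core, an integration is a parser from an export archive to memories. A hypothetical shape such an adapter could take (names are illustrative only, not the project's actual API):

```python
from dataclasses import dataclass
from typing import Iterable, Protocol

@dataclass
class Memory:
    source: str
    text: str

class SourceAdapter(Protocol):
    """Hypothetical contract a new data-source integration might satisfy."""
    name: str
    def parse(self, export_path: str) -> Iterable[Memory]: ...

class LineAdapter:
    """Toy adapter: treats each non-empty line of a text export as a memory."""
    name = "lines"

    def parse(self, export_path: str) -> Iterable[Memory]:
        with open(export_path) as f:
            for line in f:
                if line.strip():
                    yield Memory(source=self.name, text=line.strip())
```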

Your data. Your memory. Your AI.

Export your data, generate portable memories, and give your AI tools real context about you.