Native Model Context Protocol support
Reflect Memory is a first-class MCP server that AI tools connect to automatically. No adapters, no middleware, no plumbing — just point your client at the endpoint and every memory is available instantly through the standard protocol.
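Once connected, memories surface through standard MCP primitives such as tools. A sketch of what a retrieval call could look like on the wire — the tool name `search_memories` is hypothetical here, used only to illustrate the JSON-RPC shape, not Reflect's documented API:

```json
{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "tools/call",
  "params": {
    "name": "search_memories",
    "arguments": { "query": "project conventions" }
  }
}
```

Because this is the standard `tools/call` message every MCP client already emits, no client-side code changes are needed.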
Built on the open standard
Reflect Memory implements the full MCP specification so every compliant client works out of the box — no vendor SDKs required.
Streamable HTTP transport
Every request flows through a single streamable HTTP endpoint. Responses can stream back as they are produced, so large memory sets feel instantaneous, and the transport works behind corporate proxies and firewalls without special configuration.
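Under the hood, each request is a JSON-RPC 2.0 message POSTed to that one endpoint, and the reply may arrive as plain JSON or as a server-sent event stream. A sketch of the opening `initialize` message per the MCP specification — the client name and version shown are illustrative:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2025-03-26",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "1.0.0" }
  }
}
```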
OAuth 2.1 + Bearer auth
Authenticate with industry-standard OAuth 2.1 flows or pass a simple Bearer token. Either way, every memory read and write is scoped to your account and encrypted in transit — no API keys sitting in plaintext config files.
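For clients that support static request headers, Bearer auth can be a single extra field in the server entry. A minimal sketch, assuming a hypothetical endpoint URL and a token generated from your account settings — never commit a real token to version control:

```json
{
  "mcpServers": {
    "reflect-memory": {
      "url": "https://memory.reflect.example/mcp",
      "headers": {
        "Authorization": "Bearer <your-token>"
      }
    }
  }
}
```

Clients that implement the full OAuth 2.1 flow instead discover the authorization server and obtain tokens automatically, with no token pasted into config at all.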
Works everywhere MCP does
Works with Claude natively
Claude Desktop and Claude.ai support MCP servers out of the box. Add Reflect Memory once and Claude reads and writes your memories in every conversation automatically.
Cursor remote MCP
Cursor connects to Reflect Memory as a remote MCP server. Your coding assistant remembers project conventions, architecture decisions, and debugging context across every session.
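In Cursor, remote MCP servers are declared in an `mcp.json` file (project-level under `.cursor/`, or global under `~/.cursor/`). A minimal sketch, assuming a hypothetical endpoint URL — check your Reflect account for the real one:

```json
{
  "mcpServers": {
    "reflect-memory": {
      "url": "https://memory.reflect.example/mcp"
    }
  }
}
```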
Any MCP-compatible host
Any tool that speaks the Model Context Protocol can connect — Windsurf, custom agents, internal platforms. If it supports MCP, it already supports Reflect Memory.
Connect your first AI tool in under 2 minutes
Add a single server URL, authenticate, and your AI tools share memory instantly. No SDK, no build step, no configuration files.