Supermemory MCP Server 4.0 gives AI assistants (Claude, Cursor, Windsurf, etc.) persistent memory across conversations. Built on Cloudflare Workers with Durable Objects for scalable, persistent connections.

Quick Install

npx -y install-mcp@latest https://mcp.supermemory.ai/mcp --client claude --oauth=yes
Replace claude with your MCP client: cursor, windsurf, vscode, etc.

Manual Configuration

Add to your MCP client config:
{
  "mcpServers": {
    "supermemory": {
      "url": "https://mcp.supermemory.ai/mcp"
    }
  }
}
The server uses OAuth by default. Your client will discover the authorization server via /.well-known/oauth-protected-resource and prompt you to authenticate.
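
As a sketch of what that discovery step returns, the document at that path follows the standard OAuth protected resource metadata format (RFC 9728). The values below are illustrative placeholders, not Supermemory's actual configuration:
{
  "resource": "https://mcp.supermemory.ai/mcp",
  "authorization_servers": ["https://auth.example.com"]
}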

API Key Authentication (Alternative)

If you prefer API keys over OAuth, get one from app.supermemory.ai and pass it in the Authorization header:
{
  "mcpServers": {
    "supermemory": {
      "url": "https://mcp.supermemory.ai/mcp",
      "headers": {
        "Authorization": "Bearer sm_your_api_key_here"
      }
    }
  }
}
API keys start with sm_; when one is provided, the OAuth flow is skipped.

Project Scoping

Scope all operations to a specific project by sending the x-sm-project header:
{
  "mcpServers": {
    "supermemory": {
      "url": "https://mcp.supermemory.ai/mcp",
      "headers": {
        "x-sm-project": "your-project-id"
      }
    }
  }
}
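
The two header examples above can be combined in a single config, for instance to use an API key and scope to a project at the same time (both values are placeholders):
{
  "mcpServers": {
    "supermemory": {
      "url": "https://mcp.supermemory.ai/mcp",
      "headers": {
        "Authorization": "Bearer sm_your_api_key_here",
        "x-sm-project": "your-project-id"
      }
    }
  }
}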

Tools

memory

Save or forget information about the user.
Parameters:
  • content (string, required): The memory content to save or forget
  • action ("save" | "forget", optional): Default: "save"
  • containerTag (string, optional): Project tag to scope the memory
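
For illustration, this is roughly what the JSON-RPC tools/call request for memory looks like on the wire (the content value is a made-up example); your MCP client constructs and sends this for you:
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "memory",
    "arguments": {
      "content": "The user prefers TypeScript over JavaScript",
      "action": "save"
    }
  }
}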

recall

Search memories and get user profile.
Parameters:
  • query (string, required): Search query to find relevant memories
  • includeProfile (boolean, optional): Include user profile summary. Default: true
  • containerTag (string, optional): Project tag to scope the search
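
Similarly, an illustrative recall request scoped to a project (placeholder values):
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "recall",
    "arguments": {
      "query": "preferred programming language",
      "includeProfile": true,
      "containerTag": "your-project-id"
    }
  }
}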

whoAmI

Get the currently logged-in user's information. Returns { userId, email, name, client, sessionId }.
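
An illustrative result, with placeholder values for the fields listed above (the exact shape of the tool response may differ):
{
  "userId": "user_123",
  "email": "user@example.com",
  "name": "Jane Doe",
  "client": "claude",
  "sessionId": "session_abc"
}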

Resources

  • supermemory://profile: User profile with stable preferences and recent activity
  • supermemory://projects: List of available memory projects
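
Resources are read with the standard MCP resources/read request; for example, to fetch the profile resource:
{
  "jsonrpc": "2.0",
  "id": 3,
  "method": "resources/read",
  "params": {
    "uri": "supermemory://profile"
  }
}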

Prompts

context

Inject the user profile and preferences as system context for AI conversations. Returns a formatted message with the user's stable preferences and recent activity. In Cursor and Claude Code, you can invoke it by typing /context, which gives the model just enough context to use and query Supermemory effectively.
Purpose: unlike the recall tool (which searches for specific information) or the profile resource (which returns raw data), the context prompt provides a pre-formatted system message designed for context injection at the start of a conversation.
Parameters:
  • containerTag (string, optional): Project tag to scope the profile (max 128 chars)
  • includeRecent (boolean, optional): Include recent activity in the profile. Default: true
Output format:
  • Includes instructions to save new memories using the memory tool
  • Stable Preferences: Long-term user facts and preferences
  • Recent Activity: Recent interactions and context (when includeRecent is true)
  • Fallback message when no profile exists yet
When to use:
  • Use context prompt for automatic system context injection at conversation start
  • Use recall tool when you need to search for specific information
  • Use profile resource when you need raw profile data for custom processing
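
Under the hood, clients fetch this prompt with a standard MCP prompts/get request; as a sketch (the containerTag value is a placeholder):
{
  "jsonrpc": "2.0",
  "id": 4,
  "method": "prompts/get",
  "params": {
    "name": "context",
    "arguments": {
      "containerTag": "your-project-id"
    }
  }
}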

MCP Server Source Code

View the open-source implementation