MCP Server Integration
Use Papr through MCP so AI clients can store and retrieve memory with consistent tool contracts.
What You Build
- A configured MCP server connected to your Papr project
- A stable tool surface for memory read and write operations
- A repeatable setup for local development and production clients
Prerequisites
- Papr API key in `PAPR_MEMORY_API_KEY`
- A client that supports MCP configuration
- Access to the Papr MCP package or server repo
Minimal Setup
- Install and configure the MCP server.
- Add your API key to the MCP runtime environment.
- Register the server in your MCP client config (see the example config after this list).
- Restart the client and verify tools are available.
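A minimal sketch of that registration, assuming a Claude Desktop-style `mcpServers` entry; the launch command and package name below are placeholders, not the actual Papr package name:

```json
{
  "mcpServers": {
    "papr-memory": {
      "command": "npx",
      "args": ["-y", "<papr-mcp-package>"],
      "env": {
        "PAPR_MEMORY_API_KEY": "<your-api-key>"
      }
    }
  }
}
```

After editing the config, restart the client so it relaunches the server and refreshes its tool list.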
Recommended Tool Contract
Expose a minimal, predictable set of tools:
- `store_memory`
- `search_memory`
- `compress_session_context`
- `query_graph`
Keep parameter names aligned with the API:
- `external_user_id`
- `memory_policy`
- `enable_agentic_graph`
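The graph tool has no example in this section; as a rough sketch only, a `query_graph` call might reuse the same scoping and limit parameters (the exact input schema is an assumption here):

```json
{
  "query": "Which topics are linked to this user's stored preferences?",
  "external_user_id": "user_123",
  "max_nodes": 50
}
```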
Example Memory Calls
Store memory

```json
{
  "content": "User prefers weekly release summaries by email.",
  "external_user_id": "user_123",
  "memory_policy": {
    "mode": "auto",
    "consent": "implicit",
    "risk": "none"
  }
}
```

Search memory

```json
{
  "query": "What communication preference does the user have?",
  "external_user_id": "user_123",
  "enable_agentic_graph": true
}
```

Use `response_format=toon` when search output is passed directly to an LLM.
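For example, the same search call with the response format set, assuming `response_format` is accepted as a top-level field on the `search_memory` input:

```json
{
  "query": "What communication preference does the user have?",
  "external_user_id": "user_123",
  "enable_agentic_graph": true,
  "response_format": "toon"
}
```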
Production Settings
- Keep tenant scope explicit with `organization_id` and `namespace_id`
- Set conservative limits (`max_memories`, `max_nodes`) for predictable latency (see the example after this list)
- Log tool input and output metadata for traceability
- Add retry logic for transient network failures
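A sketch of a tenant-scoped `search_memory` call with conservative limits; whether these fields sit at the top level of the tool input, and the IDs themselves, are illustrative assumptions:

```json
{
  "query": "open onboarding tasks for this user",
  "external_user_id": "user_123",
  "organization_id": "org_acme",
  "namespace_id": "ns_support",
  "max_memories": 20,
  "max_nodes": 50,
  "enable_agentic_graph": false
}
```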
Validation Checklist
- MCP client lists Papr tools after startup
- `store_memory` returns success for a test payload
- `search_memory` returns user-scoped results
- Session compression tool returns `context_for_llm` for long threads (see the sketch below)
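A minimal sketch of a compression check, assuming `compress_session_context` accepts the user scope and the session messages as input (the field names below are assumptions); a successful response should include `context_for_llm`:

```json
{
  "external_user_id": "user_123",
  "messages": [
    { "role": "user", "content": "Where did we land on the release plan?" },
    { "role": "assistant", "content": "You asked for weekly release summaries by email." }
  ]
}
```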
Troubleshooting
If tools fail to load or requests fail at runtime, consult the Error Playbook and verify that the MCP runtime environment variables (such as `PAPR_MEMORY_API_KEY`) are set correctly.