## Setup
Configure long-term memory by using a `CompositeBackend` that routes the `/memories/` path to a `StoreBackend`:
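A minimal sketch, assuming the Python `deepagents` package with `CompositeBackend`, `StateBackend`, and `StoreBackend` importable from `deepagents.backends` — the import paths and constructor signatures are assumptions and may differ across versions, so check them against your installed release:

```python
# Hedged sketch -- import paths and parameter names are assumptions;
# verify against your installed deepagents version.
from deepagents import create_deep_agent
from deepagents.backends import CompositeBackend, StateBackend, StoreBackend
from langgraph.store.memory import InMemoryStore

store = InMemoryStore()  # any LangGraph BaseStore works here

backend = CompositeBackend(
    default=StateBackend(),                 # transient, per-thread files
    routes={"/memories/": StoreBackend()},  # persistent files under /memories/
)

agent = create_deep_agent(
    backend=backend,
    store=store,  # the Store that StoreBackend persists into
)
```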
## How it works
When using `CompositeBackend`, deep agents maintain two separate filesystems:
1. **Short-term (transient) filesystem**
   - Stored in the agent’s state (via `StateBackend`)
   - Persists only within a single thread
   - Files are lost when the thread ends
   - Accessed through standard paths: `/notes.txt`, `/workspace/draft.md`
2. **Long-term (persistent) filesystem**
   - Stored in a LangGraph Store (via `StoreBackend`)
   - Persists across all threads and conversations
   - Survives agent restarts
   - Accessed through paths prefixed with `/memories/`: `/memories/preferences.txt`
## Path routing
The `CompositeBackend` routes file operations based on path prefixes:

- Files with paths starting with `/memories/` are stored in the Store (persistent)
- Files without this prefix remain in transient state
- All filesystem tools (`ls`, `read_file`, `write_file`, `edit_file`) work with both
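The prefix rule can be illustrated with a small stand-alone function — a simplification of what `CompositeBackend` does internally, not its actual implementation:

```python
def route_backend(path: str) -> str:
    """Decide which backend handles a file path, mirroring the prefix rule."""
    # Paths under /memories/ go to the persistent Store backend;
    # everything else stays in transient per-thread state.
    if path.startswith("/memories/"):
        return "store"
    return "state"

print(route_backend("/memories/preferences.txt"))  # -> store
print(route_backend("/notes.txt"))                 # -> state
```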
## Cross-thread persistence
Files in `/memories/` can be accessed from any thread:
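The behavior can be simulated with plain dictionaries — a toy model of the two filesystems, not the real backends. Per-thread state is discarded with its thread, while the shared store outlives it:

```python
store = {}  # shared across threads (stands in for a LangGraph Store)

def run_thread(writes: dict) -> dict:
    """Simulate one thread: transient state lives only inside this call."""
    state = {}  # per-thread filesystem, lost when the function returns
    for path, content in writes.items():
        if path.startswith("/memories/"):
            store[path] = content  # persistent
        else:
            state[path] = content  # transient
    return state

# Thread 1 writes one transient and one persistent file.
run_thread({"/notes.txt": "scratch", "/memories/preferences.txt": "dark mode"})

# Thread 2 starts with fresh transient state, but sees the persistent file.
print(store.get("/memories/preferences.txt"))  # -> dark mode
print(store.get("/notes.txt"))                 # -> None (it was transient)
```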
## Use cases
### User preferences
Store user preferences that persist across sessions:
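One hedged sketch, reusing a `backend` and `store` configured as described in Setup — the keyword argument name for the prompt is an assumption and the prompt wording is illustrative:

```python
# Assumes `backend` and `store` are configured as in the Setup section.
# The `system_prompt` parameter name is an assumption; check your version.
from deepagents import create_deep_agent

agent = create_deep_agent(
    backend=backend,
    store=store,
    system_prompt=(
        "At the start of each conversation, read /memories/user_preferences.md "
        "if it exists and follow the preferences there. When the user states a "
        "new preference, update that file with write_file or edit_file."
    ),
)
```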
### Self-improving instructions

An agent can update its own instructions based on feedback:
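Stripped of the agent plumbing, the pattern amounts to appending lessons to a persistent instructions file — sketched here with a dict standing in for the store:

```python
# Toy model: a dict stands in for the persistent store.
store = {"/memories/instructions.md": "Base instructions.\n"}

def record_feedback(lesson: str) -> None:
    """Append a lesson learned to the persistent instructions file."""
    store["/memories/instructions.md"] += f"- {lesson}\n"

# Feedback recorded in one conversation changes behavior in every later one,
# because the file lives in the cross-thread store.
record_feedback("Prefer concise answers unless asked for detail.")
print(store["/memories/instructions.md"])
```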
### Knowledge base

Build up knowledge over multiple conversations:
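For example, a knowledge base might accumulate topic files under a common prefix (these paths are illustrative, not prescribed by the library):

```
/memories/knowledge/python_tips.md
/memories/knowledge/api_quirks.md
/memories/knowledge/customer_context.md
```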
### Research projects

Maintain research state across sessions:
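A long-running project might keep its working state under a per-project prefix (illustrative layout):

```
/memories/research/competitor_analysis/plan.md
/memories/research/competitor_analysis/findings.md
/memories/research/competitor_analysis/open_questions.md
```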
## Store implementations

Any LangGraph `BaseStore` implementation works:
### InMemoryStore (development)
Good for testing and development, but data is lost on restart:
### PostgresStore (production)

For production, use a persistent store:
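A hedged sketch — it requires the `langgraph-checkpoint-postgres` package and a reachable Postgres instance, and the class and method names should be checked against your installed version:

```python
# Sketch only: needs a running Postgres; connection string is a placeholder.
from langgraph.store.postgres import PostgresStore

conn_string = "postgresql://user:pass@localhost:5432/dbname"

with PostgresStore.from_conn_string(conn_string) as store:
    store.setup()  # create the required tables on first use
    # pass `store` to the agent as shown in the Setup section
```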
## Best practices

### Use descriptive paths
Organize persistent files with clear paths:
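For instance, self-describing paths beat generic ones (examples are illustrative):

```
# Clear, self-describing
/memories/user_preferences.md
/memories/project_x/architecture_decisions.md

# Avoid vague names
/memories/stuff.txt
/memories/data1.txt
```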
### Document the memory structure

Tell the agent what’s stored where in your system prompt:
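For instance (the exact wording and file names are illustrative, not part of the library):

```python
SYSTEM_PROMPT = """You have a persistent filesystem mounted at /memories/.

- /memories/user_preferences.md: the user's standing preferences; read at startup.
- /memories/knowledge/: accumulated notes; add to it as you learn.

Files outside /memories/ are scratch space and vanish when the thread ends.
"""
```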
### Prune old data

Implement periodic cleanup of outdated persistent files to keep storage manageable.

### Choose the right storage
- **Development**: Use `InMemoryStore` for quick iteration
- **Production**: Use `PostgresStore` or other persistent stores
- **Multi-tenant**: Consider using `assistant_id`-based namespacing in your store