MCP Memory Service
An MCP server providing semantic memory and persistent storage for Claude Desktop, built on ChromaDB and sentence transformers. The service enables long-term memory storage with semantic search, making it ideal for maintaining context across conversations and instances. It is intended for personal use only; no user management is provided.
Features
- Semantic search using sentence transformers
- Tag-based memory retrieval system
- Persistent storage using ChromaDB
- Automatic database backups
- Memory optimization tools
- Exact match retrieval
- Debug mode for similarity analysis
- Database health monitoring
- Duplicate detection and cleanup
- Customizable embedding model
Available Tools and Operations
The following list covers the core functionality exposed through the MCP server's tools, organized by primary purpose and use case. Each category represents a different aspect of memory management and interaction available through the system.
Core Memory Operations
- store_memory
  - Store new information with optional tags
  - Parameters:
    - content: String (required)
    - metadata: Object (optional)
      - tags: Array of strings
      - type: String
- retrieve_memory
  - Perform semantic search for relevant memories
  - Parameters:
    - query: String (required)
    - n_results: Number (optional, default: 5)
- search_by_tag
  - Find memories using specific tags
  - Parameters:
    - tags: Array of strings (required)
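Under the MCP specification, tools are invoked via JSON-RPC `tools/call` requests. The sketch below shows what the argument payloads for the tools above might look like; the `build_tool_call` helper is illustrative, not part of the service (an actual MCP client library constructs these requests for you):

```python
# Illustrative sketch: building MCP "tools/call" payloads for the
# memory tools above. build_tool_call is a hypothetical helper.

def build_tool_call(name: str, arguments: dict) -> dict:
    """Wrap a tool name and its arguments in a JSON-RPC tools/call request."""
    return {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

store_request = build_tool_call("store_memory", {
    "content": "Project kickoff is on Monday",
    "metadata": {"tags": ["project", "schedule"], "type": "event"},
})

search_request = build_tool_call("retrieve_memory", {
    "query": "when does the project start?",
    "n_results": 5,
})
```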
Advanced Operations
- exact_match_retrieve
  - Find memories with an exact content match
  - Parameters:
    - content: String (required)
- debug_retrieve
  - Retrieve memories along with their similarity scores
  - Parameters:
    - query: String (required)
    - n_results: Number (optional)
    - similarity_threshold: Number (optional)
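The effect of `similarity_threshold` can be sketched as follows: only hits whose similarity score meets the threshold are returned. The data and helper below are made up for illustration; the real service computes scores from sentence-transformer embeddings.

```python
# Illustrative sketch of similarity_threshold filtering. The scores
# here are invented; real scores come from embedding comparisons.

def filter_by_threshold(hits, threshold=0.7):
    """Keep (memory, score) pairs at or above the similarity threshold."""
    return [(text, score) for text, score in hits if score >= threshold]

hits = [("kickoff on Monday", 0.91), ("lunch menu", 0.42), ("sprint plan", 0.73)]
filter_by_threshold(hits)  # keeps the 0.91 and 0.73 entries
```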
Database Management
- create_backup
  - Create a database backup
  - Parameters: None
- get_stats
  - Get memory statistics
  - Returns: Database size, memory count
- optimize_db
  - Optimize database performance
  - Parameters: None
- check_database_health
  - Get database health metrics
  - Returns: Health status and statistics
- check_embedding_model
  - Verify the embedding model status
  - Parameters: None
Memory Management
- delete_memory
  - Delete a specific memory by content hash
  - Parameters:
    - content_hash: String (required)
- delete_by_tag
  - Delete all memories with a specific tag
  - Parameters:
    - tag: String (required)
- cleanup_duplicates
  - Remove duplicate entries
  - Parameters: None
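A content hash, as required by `delete_memory`, can be derived along these lines. The project's `hashing.py` defines the actual scheme; hashing the raw content with SHA-256 is an assumption made for this sketch.

```python
# Illustrative sketch of deriving a content hash for delete_memory.
# SHA-256 over the raw content is an assumption; see hashing.py for
# the project's actual scheme.
import hashlib

def content_hash(content: str) -> str:
    """Return a stable hex digest identifying a memory's content."""
    return hashlib.sha256(content.encode("utf-8")).hexdigest()

h = content_hash("Project kickoff is on Monday")
```

Because identical content always yields the same digest, duplicate entries can be detected by comparing hashes, which is the idea behind `cleanup_duplicates`.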
Performance and Maintenance
- Default similarity threshold: 0.7
- Maximum recommended memories per query: 10
- Automatic optimization at 10,000 memories
- Daily automatic backups with 7-day retention
- Regular database health monitoring recommended
- Cloud storage sync must complete before access
- Debug mode available for troubleshooting
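The 7-day retention policy amounts to pruning backups older than the retention window. The filenames and the `prune_backups` helper below are assumptions for illustration, not the service's actual code.

```python
# Illustrative sketch of 7-day backup retention: backups created
# before the cutoff are considered stale and eligible for pruning.
from datetime import datetime, timedelta

def prune_backups(backups, now, retention_days=7):
    """Return the names of backups that fall outside the retention window."""
    cutoff = now - timedelta(days=retention_days)
    return [name for name, created in backups.items() if created < cutoff]

now = datetime(2025, 1, 10)
backups = {
    "backup-2025-01-01": datetime(2025, 1, 1),
    "backup-2025-01-09": datetime(2025, 1, 9),
}
prune_backups(backups, now)  # only the January 1 backup is stale
```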
Installation
Installing via Smithery
To install the Memory Service for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install @doobidoo/mcp-memory-service --client claude
Manual Installation
- Create Python virtual environment:
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
- Install dependencies:
pip install -r requirements.txt
uv add mcp
pip install -e .
Usage
- Start the server (for testing purposes):
python src/test_management.py
- Run the isolated test for individual methods:
python src/chroma_test_isolated.py
Claude MCP configuration
Add the following to your claude_desktop_config.json file, under the top-level "mcpServers" key. Note that JSON does not allow comments, so replace the placeholders with real paths, e.g. "C:\\REPOSITORIES\\mcp-memory-service" for the repository directory, and "C:\\Users\\John.Doe\\AppData\\Local\\mcp-memory\\chroma_db" and "C:\\Users\\John.Doe\\AppData\\Local\\mcp-memory\\backups" for the storage paths:
{
  "mcpServers": {
    "memory": {
      "command": "uv",
      "args": [
        "--directory",
        "your_mcp_memory_service_directory",
        "run",
        "memory"
      ],
      "env": {
        "MCP_MEMORY_CHROMA_PATH": "your_chroma_db_path",
        "MCP_MEMORY_BACKUPS_PATH": "your_backups_path"
      }
    }
  }
}
Storage Structure and Settings
../your_mcp_memory_service_directory/mcp-memory/
├── chroma_db/   # Main vector database
└── backups/     # Automatic backups
Configure through environment variables:
- CHROMA_DB_PATH: Path to ChromaDB storage
- BACKUP_PATH: Path for backups
- AUTO_BACKUP_INTERVAL: Backup interval in hours (default: 24)
- MAX_MEMORIES_BEFORE_OPTIMIZE: Threshold for auto-optimization (default: 10000)
- SIMILARITY_THRESHOLD: Default similarity threshold (default: 0.7)
- MAX_RESULTS_PER_QUERY: Maximum results per query (default: 10)
- BACKUP_RETENTION_DAYS: Number of days to keep backups (default: 7)
- LOG_LEVEL: Logging level (default: INFO)
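Resolving these settings with their documented defaults might look like the sketch below. The variable names come from the list above; the `load_settings` helper itself is an assumption, not the project's actual configuration code.

```python
# Illustrative sketch of reading the documented environment variables
# with their default values. load_settings is a hypothetical helper.
import os

def load_settings(env):
    """Resolve configuration values, falling back to documented defaults."""
    return {
        "chroma_db_path": env.get("CHROMA_DB_PATH"),  # no documented default
        "backup_path": env.get("BACKUP_PATH"),        # no documented default
        "auto_backup_interval": int(env.get("AUTO_BACKUP_INTERVAL", "24")),
        "max_memories_before_optimize": int(env.get("MAX_MEMORIES_BEFORE_OPTIMIZE", "10000")),
        "similarity_threshold": float(env.get("SIMILARITY_THRESHOLD", "0.7")),
        "max_results_per_query": int(env.get("MAX_RESULTS_PER_QUERY", "10")),
        "backup_retention_days": int(env.get("BACKUP_RETENTION_DAYS", "7")),
        "log_level": env.get("LOG_LEVEL", "INFO"),
    }

settings = load_settings(dict(os.environ))
```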
Sample Use Cases
Semantic requests (natural language), for example:
- "Please remember that my project deadline is May 15th."
- "What do you remember about my project deadline?"
Call by tool name, for example:
- "Use search_by_tag to find everything tagged 'project'."
These phrasings are illustrative; any wording that conveys the intent will work.
Testing
The project includes test suites for verifying the core functionality:
# Install test dependencies
pip install pytest pytest-asyncio
# Run all tests
pytest tests/
# Run specific test categories
pytest tests/test_memory_ops.py
pytest tests/test_semantic_search.py
pytest tests/test_database.py
Test scripts are available in the tests/ directory:
- test_memory_ops.py: Tests core memory operations (store, retrieve, delete)
- test_semantic_search.py: Tests semantic search functionality and similarity scoring
- test_database.py: Tests database operations (backup, health checks, optimization)
Each test file includes:
- Proper test fixtures for server setup and teardown
- Async test support using pytest-asyncio
- Comprehensive test cases for the related functionality
- Error case handling and validation
Project Structure
../your_mcp_memory_service_directory/src/mcp_memory_service/
├── __init__.py
├── config.py          # Configuration utilities
├── models/
│   ├── __init__.py
│   └── memory.py      # Memory data models
├── storage/
│   ├── __init__.py
│   ├── base.py        # Abstract base storage class
│   └── chroma.py      # ChromaDB implementation
├── utils/
│   ├── __init__.py
│   ├── db_utils.py    # Database utility functions
│   ├── debug.py       # Debugging utilities
│   └── hashing.py     # Hashing utilities
└── server.py          # Main MCP server
Additional Development Files
../your_mcp_memory_service_directory
├── scripts/
│   ├── migrate_tags.py       # Tag migration script
│   ├── repair_memories.py    # Memory repair script
│   └── validate_memories.py  # Memory validation script
└── tests/
    ├── __init__.py
    ├── test_memory_ops.py
    ├── test_semantic_search.py
    └── test_database.py
Required Dependencies
chromadb==0.5.23
sentence-transformers>=2.2.2
tokenizers==0.20.3
websockets>=11.0.3
pytest>=7.0.0
pytest-asyncio>=0.21.0
Important Notes
- When storing in cloud, always ensure iCloud or other Cloud Drives sync is complete before accessing from another device
- Regular backups are crucial when testing new features
- Monitor ChromaDB storage size and optimize as needed
- The service includes automatic backup functionality intended to run every 24 hours (TBD)
- Debug mode is available for troubleshooting semantic search results
- Memory optimization runs automatically when database size exceeds configured thresholds
Troubleshooting
- Check logs in ..\Claude\logs\mcp-server-memory.log
- Use debug_retrieve for search issues
- Monitor database health with check_database_health
- Use exact_match_retrieve for precise matching
Development Guidelines
- Follow PEP 8
- Use type hints
- Include docstrings for all functions and classes
- Add tests for new features
Pull Request Process
- Create a feature branch
- Add tests for new functionality
- Update documentation
- Submit PR with description of changes
License
MIT License - See LICENSE file for details
Acknowledgments
- ChromaDB team for the vector database
- Sentence Transformers project for embedding models
- MCP project for the protocol specification
Statement of Gratitude
A special thanks to God, my ultimate source of strength and guidance, and to my wife for her unwavering patience and support throughout this project. I'd also like to express my gratitude to Claude from Anthropic for his invaluable contributions and expertise. This project wouldn't have been possible without your collective support.