Local-first AI Memory Layer with MCP Support
Your memories, your control. Privacy-focused AI memory that works across any MCP-compatible client.
🚀 Quick Start • 📖 Documentation • 🔌 MCP Setup • 🛠️ Development
- 🔒 Privacy First - All data stays on your server using official Mem0 setup
- 🌐 Universal Memory - Works with Claude, Cursor, Windsurf, and all MCP clients
- ⚡ Lightning Fast - Vector-based semantic search with Qdrant + PostgreSQL
- 🐳 One-Click Deploy - Uses Mem0's official easy setup script
- 🔄 Auto-Deploy - GitHub Actions integration for seamless updates
- 📊 Built-in Dashboard - Official Mem0 web UI for memory management
- 🏗️ Production Ready - Mem0's battle-tested architecture
- 📈 Monitoring Ready - Built-in health checks and logging
```bash
# Clone and run the installer
git clone https://github.com/yourusername/openmemory-mcp-coolify.git
cd openmemory-mcp-coolify
chmod +x coolify-install.sh
# Set required environment variables
export OPENAI_API_KEY="your-openai-api-key"
export DOMAIN="memory.yourdomain.com"
# Run the installer
./coolify-install.sh
```

Alternatively, deploy manually with Docker Compose:

```bash
# Clone repository
git clone https://github.com/yourusername/openmemory-mcp-coolify.git
cd openmemory-mcp-coolify
# Create environment file
cp .env.example .env
# Edit .env with your configuration
# Deploy with Docker Compose
docker-compose -f .coolify/docker-compose.yml up -d
```

Prerequisites:

- Docker & Docker Compose
- OpenAI API Key (Get one here)
- Domain name (optional, can use localhost)
- Coolify instance (for production deployment)
| Variable | Required | Default | Description |
|---|---|---|---|
| OPENAI_API_KEY | ✅ | - | OpenAI API key for embeddings |
| DOMAIN | ❌ | localhost | Your domain name |
| APP_NAME | ❌ | openmemory | Application name prefix |
| LOG_LEVEL | ❌ | INFO | Logging level (DEBUG, INFO, WARNING, ERROR) |
| MAX_MEMORY_SIZE | ❌ | 1000 | Maximum number of memories |
| EMBEDDING_MODEL | ❌ | text-embedding-3-small | OpenAI embedding model |
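For the manual Docker Compose path, a minimal `.env` might look like the sketch below. The values are placeholders: only the API key is required, and everything else simply repeats the defaults from the table above.

```bash
# .env — example values only; replace with your own
OPENAI_API_KEY=sk-your-openai-api-key   # required
DOMAIN=memory.yourdomain.com            # or localhost for local testing
APP_NAME=openmemory
LOG_LEVEL=INFO
MAX_MEMORY_SIZE=1000
EMBEDDING_MODEL=text-embedding-3-small
```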
Add to your Claude Desktop configuration:
```json
{
"mcpServers": {
"openmemory": {
"transport": "sse",
"url": "https://your-domain.com/mcp/claude/sse/YOUR_USERNAME"
}
}
}
```

Add to your Cursor configuration:

```json
{
"mcpServers": {
"openmemory": {
"transport": "sse",
"url": "https://your-domain.com/mcp/cursor/sse/YOUR_USERNAME"
}
}
}
```

Or install automatically with install-mcp:

```bash
npx install-mcp i "https://your-domain.com/mcp/claude/sse/YOUR_USERNAME" --client claude
```

To deploy on Coolify:

- Go to your Coolify dashboard
- Create new project: "OpenMemory MCP"
- Choose "Git Repository" as source
- Set repository URL: https://github.com/yourusername/openmemory-mcp-coolify.git
- Docker Compose File: .coolify/docker-compose.yml
- Environment: Set required variables
- Domain: Configure your domain with SSL
Push to main branch or trigger manual deployment from Coolify dashboard.
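Once the deployment is live, a quick sanity check is to hit the API health endpoint and the MCP SSE URL used in the client configs above. This is only a sketch: it assumes your domain proxies to the API service and reuses this README's placeholder domain and username.

```bash
# API health check — should return HTTP 200
curl -fsS https://memory.yourdomain.com/health

# MCP SSE endpoint — an event stream should start (Ctrl+C to stop)
curl -N https://memory.yourdomain.com/mcp/claude/sse/YOUR_USERNAME
```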
| Service | Port | Description | Health Check |
|---|---|---|---|
| UI | 3000 | React dashboard for memory management | GET / |
| API | 8765 | FastAPI backend with MCP endpoints | GET /health |
| Qdrant | 6333 | Vector database for semantic search | GET /health |
| Redis | 6379 | Optional caching layer | PING |
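To check all services at once on the host, a small loop over the endpoints in the table works. A minimal sketch, assuming the default ports and health paths listed above and that `redis-cli` is installed locally:

```bash
#!/usr/bin/env bash
# Probe each HTTP service's health endpoint on the local host
set -u
declare -A endpoints=(
  [ui]="http://localhost:3000/"
  [api]="http://localhost:8765/health"
  [qdrant]="http://localhost:6333/health"
)
for name in "${!endpoints[@]}"; do
  if curl -fsS --max-time 5 "${endpoints[$name]}" > /dev/null; then
    echo "OK    $name (${endpoints[$name]})"
  else
    echo "FAIL  $name (${endpoints[$name]})"
  fi
done

# Redis answers PING rather than HTTP
redis-cli -h localhost -p 6379 ping
```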
Management scripts:

```bash
# Deploy/Update
./scripts/deploy.sh
# Check status
./scripts/status.sh
# View logs
./scripts/logs.sh [service-name]
# Backup data
./scripts/backup.sh
# Stop services
./scripts/stop.sh
```

This project is licensed under the MIT License - see the LICENSE file for details.
- Mem0 - Original OpenMemory implementation
- Model Context Protocol - MCP specification
- Coolify - Self-hosted deployment platform
Made with ❤️ for the AI community
⭐ Star this repository if you find it helpful!