Standalone MCP server for semantic search over your markdown knowledge vault. Generates embeddings via Ollama with automatic indexing and live file watching.
Ollama must be running on your system. Install it from ollama.com, then pull the embedding model:

```shell
ollama pull nomic-embed-text
```

Install dependencies:

```shell
npm install
```

Start the server, pointing it at your vault:

```shell
VAULT_PATH=/path/to/your/vault bun run start
```

The server runs on http://0.0.0.0:3939 by default.
On first run, Synapse will:
- Scan your vault for markdown files
- Generate embeddings via Ollama
- Store embeddings in SQLite database
- Start a file watcher for live updates
| Variable | Required | Default | Description |
|---|---|---|---|
| `VAULT_PATH` | Yes | - | Path to markdown vault |
| `ENV_PATH` | No | `$VAULT_PATH/.synapse` | Where `embeddings.db` is stored |
| `MCP_PORT` | No | `3939` | HTTP port |
| `MCP_HOST` | No | `0.0.0.0` | Bind address |
| `OLLAMA_URL` | No | `http://localhost:11434` | Ollama API endpoint |
| `OLLAMA_MODEL` | No | `nomic-embed-text` | Embedding model name |
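For example, a startup command that overrides several of these defaults might look like this (the paths shown are placeholders, not required locations):

```shell
# Store the index outside the vault and bind only to localhost.
VAULT_PATH="$HOME/notes" \
ENV_PATH="$HOME/.cache/synapse" \
MCP_HOST=127.0.0.1 \
bun run start
```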
```shell
claude mcp add --scope user --transport sse synapse http://<remote-host>:3939/sse
```

Replace `<remote-host>` with the IP or hostname of the machine running the server.
| Endpoint | Method | Description |
|---|---|---|
| `/sse` | GET | SSE connection for MCP |
| `/messages?sessionId=X` | POST | Client-to-server messages |
| `/health` | GET | Health check |
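A quick smoke test against a running server (this assumes the default host and port; the exact shape of the health response is not specified here):

```shell
# Should return an HTTP 200 if the server is up.
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:3939/health
```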
- `search_notes` - Keyword search in note paths
- `get_similar_notes` - Semantic similarity using embeddings
- `get_connection_graph` - Multi-level connection exploration
Initial indexing: On startup, Synapse scans your vault and generates embeddings for all markdown files using the specified Ollama model. These embeddings are stored in a local SQLite database.
Live updates: The built-in file watcher automatically detects changes, additions, and deletions in your vault, re-embedding modified files in real time.
Semantic search: Tools use embeddings to find semantically related notes and build multi-hop connection graphs across your vault.
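The ranking behind this kind of semantic lookup is typically cosine similarity between embedding vectors. A toy sketch (the three-element vectors below are made-up values for illustration; real `nomic-embed-text` embeddings have hundreds of dimensions, and this is not Synapse's actual code):

```shell
# Cosine similarity between two small "embedding" vectors via awk.
a="0.1 0.3 0.5"
b="0.2 0.1 0.4"
sim=$(printf '%s\n%s\n' "$a" "$b" | awk '
  NR==1 { n = split($0, x, " ") }
  NR==2 { split($0, y, " ")
          for (i = 1; i <= n; i++) { dot += x[i]*y[i]; nx += x[i]*x[i]; ny += y[i]*y[i] }
          # Cosine similarity: dot product over the product of magnitudes.
          printf "%.4f", dot / (sqrt(nx) * sqrt(ny)) }')
echo "cosine similarity: $sim"
```

Notes whose embeddings score highest against the query embedding are returned first; chaining such lookups note-to-note yields the multi-hop connection graph.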