A complete guide to using the progressive loading pattern with MCP tools.
Progressive loading generates one TypeScript file per MCP tool, achieving 98% token savings by loading only the tools you need. Instead of loading 30,000 tokens for all tools, you load 500-1,500 tokens per tool.
```bash
# Generate TypeScript files for GitHub MCP server
mcp-execution-cli generate docker \
  --arg=run --arg=-i --arg=--rm \
  --arg=-e --arg=GITHUB_PERSONAL_ACCESS_TOKEN \
  --arg=ghcr.io/github/github-mcp-execution-server \
  --env=GITHUB_PERSONAL_ACCESS_TOKEN=github_pat_YOUR_TOKEN \
  --name=github

# Files are created in: ~/.claude/servers/github/
```

Create `~/.claude/mcp.json` with your server configurations:
```json
{
  "mcpServers": {
    "github": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "GITHUB_PERSONAL_ACCESS_TOKEN",
        "ghcr.io/github/github-mcp-execution-server"
      ],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "github_pat_YOUR_TOKEN_HERE"
      }
    }
  }
}
```

See `mcp.json.example` for more server examples.
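At runtime, the bridge looks a server up by name in this file. A minimal sketch of that lookup, assuming the config shape above (`resolveServer` and the interface names are illustrative, not the bridge's actual API):

```typescript
// Illustrative sketch: resolve a server's launch command from an
// mcp.json-shaped config object. Names here are assumptions.
interface ServerConfig {
  command: string;
  args: string[];
  env?: Record<string, string>;
}

interface McpConfig {
  mcpServers: Record<string, ServerConfig>;
}

function resolveServer(config: McpConfig, name: string): ServerConfig {
  const server = config.mcpServers[name];
  if (!server) {
    // A clear error message, per the bridge's error-handling goal
    throw new Error(`Server "${name}" not found in ~/.claude/mcp.json`);
  }
  return server;
}
```

Keeping the lookup keyed by server name is what lets generated tool files refer to their server with a single string like `'github'`.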
The runtime bridge requires the MCP SDK:

```bash
npm install @modelcontextprotocol/sdk
```

Load only the tools you need:
```typescript
// Load only the specific tools needed
import { createIssue } from '~/.claude/servers/github/createIssue.js';
import { listIssues } from '~/.claude/servers/github/listIssues.js';

// Call tools with type-safe parameters
const issue = await createIssue({
  owner: 'myorg',
  repo: 'myrepo',
  title: 'Bug: Login button not working',
  body: 'Steps to reproduce:\n1. Navigate to /login\n2. Click login button\n3. Nothing happens'
});

console.log(`Created issue #${issue.number}`);
```

The generator writes one file per tool:

```
~/.claude/servers/github/
├── _runtime/
│   └── mcp-bridge.ts     # Runtime bridge for MCP communication
├── index.ts              # Re-exports all tools (not recommended)
├── createIssue.ts        # Individual tool (500 tokens)
├── updateIssue.ts        # Individual tool (500 tokens)
├── listIssues.ts         # Individual tool (500 tokens)
└── ...                   # 37 more tools
```
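Each generated file is essentially a typed wrapper around the runtime bridge. A hypothetical sketch of what `createIssue.ts` might contain; the parameter interface and the inline `callMCPTool` stub are illustrative assumptions, not the generator's actual output (in the real file, `callMCPTool` would be imported from `./_runtime/mcp-bridge.js`):

```typescript
// Hypothetical shape of a generated createIssue.ts (illustrative only).
// Stubbed here so the sketch is self-contained; the real wrapper imports
// callMCPTool from './_runtime/mcp-bridge.js'.
async function callMCPTool(server: string, tool: string, params: object) {
  return { server, tool, params }; // stub: echo the call
}

// Parameter interface derived from the tool's JSON schema (assumed fields)
export interface CreateIssueParams {
  owner: string;
  repo: string;
  title: string;
  body?: string;
  labels?: string[];
}

// Thin wrapper: one tool, one file, roughly 500 tokens
export async function createIssue(params: CreateIssueParams) {
  return callMCPTool('github', 'create_issue', params);
}
```

Because each wrapper is this small, importing three of them costs roughly 1,500 tokens instead of the full catalog.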
```typescript
// GOOD: Load only what you need (500 tokens)
import { createIssue } from '~/.claude/servers/github/createIssue.js';

// BAD: Loads everything (30,000 tokens)
import * as github from '~/.claude/servers/github/index.js';
```

| Pattern | Tools Loaded | Tokens | Savings |
|---|---|---|---|
| Traditional | All 40 tools | ~30,000 | 0% |
| Progressive | 1 tool | ~500 | 98% |
| Progressive | 3 tools | ~1,500 | 95% |
| Progressive | 10 tools | ~5,000 | 83% |
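The savings column is simply `1 - (loaded / total)`, rounded to a whole percent; a quick check against the table's numbers:

```typescript
// Percentage of tokens saved versus loading all tools (~30,000 tokens)
const savings = (loadedTokens: number, totalTokens = 30_000): number =>
  Math.round((1 - loadedTokens / totalTokens) * 100);

// savings(500) → 98, savings(1500) → 95, savings(5000) → 83
```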
GitHub (Docker):

```json
{
  "github": {
    "command": "docker",
    "args": [
      "run", "-i", "--rm",
      "-e", "GITHUB_PERSONAL_ACCESS_TOKEN",
      "ghcr.io/github/github-mcp-execution-server"
    ],
    "env": {
      "GITHUB_PERSONAL_ACCESS_TOKEN": "github_pat_..."
    }
  }
}
```

Filesystem (npx):

```json
{
  "filesystem": {
    "command": "npx",
    "args": [
      "-y",
      "@modelcontextprotocol/server-filesystem",
      "/path/to/allowed/directory"
    ]
  }
}
```

PostgreSQL (npx):

```json
{
  "postgres": {
    "command": "npx",
    "args": [
      "-y",
      "@modelcontextprotocol/server-postgres",
      "postgresql://localhost/mydb"
    ]
  }
}
```

The `_runtime/mcp-bridge.ts` file provides the connection between generated TypeScript code and MCP servers.
- Connection Caching: Reuses MCP client connections across multiple tool calls
- Automatic Configuration: Reads server definitions from `~/.claude/mcp.json`
- Error Handling: Clear error messages for debugging
- Type Safety: Full TypeScript type checking
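Connection caching is the feature that keeps repeated tool calls cheap. The general pattern can be sketched as a map from server name to a lazily created client; the stubbed `connect` and all names here are illustrative, not the bridge's actual internals:

```typescript
// Illustrative connection-caching sketch (not the actual bridge code).
type Client = {
  callTool: (name: string, params: object) => Promise<object>;
};

const connections = new Map<string, Client>();
let connectCount = 0; // tracks real connections, for illustration

// Stand-in for spawning an MCP server and connecting over stdio
async function connect(server: string): Promise<Client> {
  connectCount++;
  return { callTool: async (name, params) => ({ server, name, params }) };
}

// Reuse a cached client if one exists; otherwise connect and cache it
async function getClient(server: string): Promise<Client> {
  let client = connections.get(server);
  if (!client) {
    client = await connect(server);
    connections.set(server, client);
  }
  return client;
}
```

With this shape, ten tool calls against the same server spawn one process, not ten.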
`callMCPTool(serverName, toolName, params)` calls an MCP tool with the given parameters:

```typescript
const result = await callMCPTool('github', 'create_issue', {
  owner: 'myorg',
  repo: 'myrepo',
  title: 'Bug report',
  body: 'Description'
});
```

`closeAllConnections()` closes all cached connections (call it during shutdown):
```typescript
// Cleanup on exit
process.on('SIGINT', async () => {
  await closeAllConnections();
  process.exit(0);
});
```

Create an issue:

```typescript
import { createIssue } from '~/.claude/servers/github/createIssue.js';

const issue = await createIssue({
  owner: 'modelcontextprotocol',
  repo: 'specification',
  title: 'Feature request: Support for streaming responses',
  body: 'It would be great to support streaming for long-running operations...',
  labels: ['enhancement', 'discussion']
});

console.log(`Issue created: ${issue.html_url}`);
```

List open pull requests:

```typescript
import { listPullRequests } from '~/.claude/servers/github/listPullRequests.js';

const prs = await listPullRequests({
  owner: 'modelcontextprotocol',
  repo: 'specification',
  state: 'open',
  perPage: 10
});

prs.forEach(pr => {
  console.log(`#${pr.number}: ${pr.title} by ${pr.user.login}`);
});
```

Search code:

```typescript
import { searchCode } from '~/.claude/servers/github/searchCode.js';

const results = await searchCode({
  query: 'language:rust progressive loading',
  perPage: 5
});

results.items.forEach(item => {
  console.log(`${item.repository.full_name}:${item.path}`);
});
```

Solution: Create `~/.claude/mcp.json` with your server configurations.
```bash
cp examples/mcp.json.example ~/.claude/mcp.json
# Edit with your tokens and paths
```

Solution: Add the server to your `mcp.json`:

```json
{
  "mcpServers": {
    "xyz": {
      "command": "...",
      "args": [...]
    }
  }
}
```

Solution: Install the MCP SDK:

```bash
npm install @modelcontextprotocol/sdk
```

Progressive loading is optimized for:
- Generation: ~2ms to generate 42 files (TypeScript code)
- Loading: 500-1,500 tokens per tool vs 30,000 for all tools
- Runtime: Connections cached and reused across calls
```bash
mcp-execution-cli generate <server> [OPTIONS]
```

Options:

- `--name <NAME>` - Custom directory name (default: server command)
- `--arg <ARG>` - Server command argument (repeatable)
- `--env <KEY=VALUE>` - Environment variable (repeatable)
- `--progressive-output <DIR>` - Custom output directory
```bash
# GitHub via Docker with custom name
mcp-execution-cli generate docker \
  --arg=run --arg=-i --arg=--rm \
  --arg=ghcr.io/github/github-mcp-execution-server \
  --name=github

# Filesystem via npx
mcp-execution-cli generate npx \
  --arg=-y \
  --arg=@modelcontextprotocol/server-filesystem \
  --arg=/path/to/dir \
  --name=filesystem

# Custom output directory
mcp-execution-cli generate docker \
  --arg=... \
  --progressive-output=/tmp/my-tools
```

```
┌─────────────────────┐
│ Generated Tools     │  500-1,500 tokens each
│ (createIssue.ts)    │  Type-safe interfaces
└──────────┬──────────┘
           │
           ↓
┌─────────────────────┐
│ Runtime Bridge      │  Connection management
│ (mcp-bridge.ts)     │  Parameter serialization
└──────────┬──────────┘
           │
           ↓
┌─────────────────────┐
│ MCP SDK             │  Official @modelcontextprotocol/sdk
│ (StdioTransport)    │  stdio communication
└──────────┬──────────┘
           │
           ↓
┌─────────────────────┐
│ MCP Server          │  GitHub, Filesystem, etc.
│ (Docker/npx)        │  Tool execution
└─────────────────────┘
```