A Quarto presentation teaching engineers how to build applications using LLMs with the Model Context Protocol (MCP).
To preview the presentation:

```bash
quarto preview
```
- Set up environment variables:

  ```bash
  # Copy the example environment file
  cp demo/.env.example demo/.env
  ```

- Configure API keys in `demo/.env`:
  - `OPENAI_API_KEY`: your OpenAI API key (get it from the OpenAI Dashboard)
  - `GITHUB_PERSONAL_ACCESS_TOKEN`: a GitHub token with gist creation permissions (see setup below)

- Install dependencies and run:

  ```bash
  # Install dependencies
  uv sync

  # Start the MCP server
  uv run demo/server.py

  # In another terminal, run the client
  uv run demo/client.py
  ```
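Under the hood, the client and server exchange JSON-RPC 2.0 messages, the wire format MCP specifies. A minimal sketch of the request shape a tool invocation uses — the tool name `search_arxiv` and the helper are illustrative, not taken from the demo code:

```python
import json

def make_tool_call(request_id: int, name: str, arguments: dict) -> dict:
    # MCP messages are JSON-RPC 2.0; "tools/call" invokes a named tool on the server
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

# A hypothetical search request the client might send over stdio to the server
req = make_tool_call(1, "search_arxiv", {"query": "model context protocol"})
print(json.dumps(req, indent=2))
```

The SDK used by `demo/client.py` builds and parses these messages for you; the sketch only shows what travels between the two processes.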
To create the GitHub token:

- Go to GitHub Settings > Developer settings > Personal access tokens
- Click "Generate new token" → "Generate new token (classic)"
- Give it a descriptive name (e.g., "MCP 101 Demo")
- Select the `gist` scope (required for creating gists)
- Click "Generate token"
- Copy the token and add it to your `demo/.env` file
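With the token in place, the gist side of the demo reduces to one authenticated POST to GitHub's `/gists` endpoint. A minimal sketch using only the standard library — the helper names are hypothetical, not from `demo/server.py`:

```python
import json
import urllib.request

GITHUB_API = "https://api.github.com/gists"

def build_gist_payload(description: str, filename: str, content: str,
                       public: bool = False) -> dict:
    # Request body shape for GitHub's "create a gist" endpoint
    return {
        "description": description,
        "public": public,
        "files": {filename: {"content": content}},
    }

def create_gist(token: str, payload: dict) -> dict:
    # POST the payload, authenticating with the token from demo/.env
    req = urllib.request.Request(
        GITHUB_API,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

payload = build_gist_payload("arXiv summary", "summary.md", "# Notes")
print(payload)
```

A token without the `gist` scope makes `create_gist` fail with HTTP 403, which is why the scope is marked required above.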
The presentation covers how to:

- Bridge the gap between LLMs and real-world data
- Build MCP servers with tools, resources, and prompts
- Implement authentication and progress tracking
- Navigate the growing MCP ecosystem
The demo shows ArXiv paper search integrated with GitHub gist creation through MCP.
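The arXiv half of that pipeline is plain HTTP: the public arXiv API takes a `search_query` parameter and returns Atom XML. A sketch of building such a query URL (the function name is illustrative, not from the demo code):

```python
from urllib.parse import urlencode

ARXIV_API = "http://export.arxiv.org/api/query"

def build_arxiv_query(terms: str, max_results: int = 5) -> str:
    # "all:" searches every field; the response is an Atom XML feed
    params = {
        "search_query": f"all:{terms}",
        "start": 0,
        "max_results": max_results,
    }
    return f"{ARXIV_API}?{urlencode(params)}"

url = build_arxiv_query("model context protocol")
print(url)
```

Fetching this URL and parsing the feed entries is what an MCP search tool would wrap, so the LLM never touches URLs or XML directly.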