Explore music scenes as graphs using web search, Spotify metadata, and an LLM-driven extraction pipeline. The app builds a network of artists and renders it in an interactive web UI, along with a narrative report.
- Input: a seed query describing a music scene (e.g., "UK '70s post punk").
- Pipeline: web search → artist extraction → iterative expansion via related artists → Spotify genres context → graph building → final report generation.
- Output: an interactive graph (nodes = artists; links = undirected relations) and a short report summarizing the scene.
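The pipeline steps share one mutable state that flows through the workflow; a minimal sketch of its shape (field names are illustrative, not the actual `src/state.py` definitions):

```python
from typing import TypedDict

# Illustrative state shape; the real definitions live in src/state.py.
class SceneState(TypedDict):
    seed_query: str    # e.g. "UK '70s post punk"
    queue: list[str]   # artists awaiting expansion
    nodes: list[dict]  # graph nodes: {"id": artist_name}
    links: list[dict]  # undirected links: {source, target, type, confidence}
    report: str        # final narrative report

def initial_state(seed_query: str) -> SceneState:
    """Empty state before the bootstrap search runs."""
    return SceneState(seed_query=seed_query, queue=[], nodes=[], links=[], report="")

state = initial_state("UK '70s post punk")
```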
## Backend
- FastAPI: HTTP API and static file hosting.
- Uvicorn: ASGI server.
- LangChain + LangChain Community: Prompting and model integration.
- LangGraph: Orchestrates the multi-step workflow as a state machine/graph.
- Ollama via `ChatOllama`: Local LLM runtime (configurable via env).
- Tavily: Web search API for scene discovery and related-artists content.
- Spotipy: Spotify Web API client (used here for artist genres).
## Frontend
- D3.js: Force-directed graph visualization with zoom/pan, drag, pin/unpin, tooltips.
- `src/main.py`: FastAPI application (API + static hosting).
- `src/agent.py`: LangGraph workflow nodes and wiring (bootstrap search → iterate → report).
- `src/tools.py`: Tavily and Spotify tools for search and metadata.
- `src/llm.py`: LLM configuration (Ollama chat model; JSON mode for structured extraction).
- `src/state.py`: Shared state type used by the workflow.
- `static/index.html`: Web UI (graph viewer + controls + final report).
- `requirements.txt`: Python dependencies.
## Bootstrap scene discovery
- Tavily search for "main artists for music scene".
- LLM extracts artist names from the returned text (JSON-structured extraction).
- Initialize the graph with these artists as nodes.
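Since the LLM runs in JSON mode, the bootstrap step reduces to parsing its output and seeding the graph; a sketch with a hard-coded string standing in for the real model call (the `"artists"` key and helper name are assumptions, not the project's actual schema):

```python
import json

def extract_artists(llm_json: str) -> list[str]:
    """Parse JSON-mode LLM output and return a deduplicated artist list."""
    data = json.loads(llm_json)
    seen: set[str] = set()
    artists: list[str] = []
    for name in data.get("artists", []):
        if name and name not in seen:
            seen.add(name)
            artists.append(name)
    return artists

# Stand-in for a real ChatOllama call over the Tavily search results.
llm_output = '{"artists": ["Joy Division", "Wire", "Joy Division", "Gang of Four"]}'
artists = extract_artists(llm_output)
nodes = [{"id": a} for a in artists]  # initial graph nodes
queue = list(artists)                 # seed the expansion queue
```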
## Iterative expansion
- Pop one artist from the queue as `current_artist`.
- Fetch context:
  - Spotify genres (via Spotipy; client-credentials flow).
  - Tavily web results for "artists related to or influenced by <current_artist>".
- LLM extracts:
  - `new_artists`: new nodes to add to the queue/graph.
  - `new_links`: undirected links `{source, target, type, confidence}` (defaults: `type="related"`, `confidence ≈ 0.6` if not provided).
- Continue until the queue empties or `max_loops` is reached.
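Stripped of the LLM and external API calls, the expansion step is a bounded breadth-first traversal; a sketch with a stubbed extractor standing in for the real Tavily/Spotify/LLM step (all names here are illustrative):

```python
from collections import deque

def expand(seed_artists, extract, max_loops=10):
    """Bounded BFS over artists; `extract` stands in for the LLM extraction step."""
    queue = deque(seed_artists)
    nodes = set(seed_artists)
    links = []
    loops = 0
    while queue and loops < max_loops:
        current = queue.popleft()
        new_artists, new_links = extract(current)
        for a in new_artists:
            if a not in nodes:          # only unseen artists join the queue
                nodes.add(a)
                queue.append(a)
        links.extend(new_links)
        loops += 1
    return nodes, links

# Stubbed extractor: the real one queries Tavily/Spotify and calls the LLM.
def fake_extract(artist):
    related = {"Joy Division": ["Bauhaus"], "Bauhaus": ["Siouxsie and the Banshees"]}
    found = related.get(artist, [])
    return found, [{"source": artist, "target": t, "type": "related", "confidence": 0.6}
                   for t in found]

nodes, links = expand(["Joy Division"], fake_extract, max_loops=10)
```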
## Final report
- LLM generates a short narrative report based on the full graph and the seed query.
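One way to picture this step: the graph is flattened into text before being handed to the LLM. A hypothetical prompt-assembly sketch (the actual prompt lives in `src/agent.py` and may differ):

```python
def build_report_prompt(seed_query: str, nodes: list[dict], links: list[dict]) -> str:
    """Flatten the graph into a text prompt for the report-writing LLM call."""
    artist_list = ", ".join(n["id"] for n in nodes)
    relations = "\n".join(f'- {l["source"]} <-> {l["target"]} ({l["type"]})' for l in links)
    return (
        f"Write a short narrative report on the music scene: {seed_query}.\n"
        f"Artists in the graph: {artist_list}\n"
        f"Known relations:\n{relations}"
    )

prompt = build_report_prompt(
    "UK '70s post punk",
    [{"id": "Joy Division"}, {"id": "Bauhaus"}],
    [{"source": "Joy Division", "target": "Bauhaus", "type": "related"}],
)
```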
- Links are currently modeled as undirected relations in the UI (no arrowheads).
- Each link carries:
  - `type`: relation type (default `"related"`).
  - `confidence`: a float in `[0, 1]`; the UI maps this to stroke width.
- Tooltips show `type` and `confidence` on hover.
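The defaulting behavior described above can be captured in a small normalizer; a sketch (the field names match the link shape above, but the function itself is illustrative, not the project's code):

```python
def normalize_link(raw: dict) -> dict:
    """Apply defaults and clamp confidence to [0, 1] for an LLM-extracted link."""
    confidence = raw.get("confidence", 0.6)        # default when the LLM omits it
    confidence = max(0.0, min(1.0, float(confidence)))
    return {
        "source": raw["source"],
        "target": raw["target"],
        "type": raw.get("type", "related"),        # default relation type
        "confidence": confidence,
    }

link = normalize_link({"source": "Wire", "target": "Gang of Four"})
```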
Request body:

```json
{
  "seed_query": "UK '70s post punk",
  "max_loops": 10
}
```

Response body (shape):

```json
{
  "graph_data": {
    "nodes": [{ "id": "Joy Division" }, { "id": "Bauhaus" }],
    "links": [
      { "source": "Joy Division", "target": "Bauhaus", "type": "related", "confidence": 0.7, "weight": 0.7 }
    ]
  },
  "report": "Joy Division are considered among the most iconic bands..."
}
```

- Served at `/` (and assets under `/static`).
- Features:
- Form to submit a seed query and loop count.
- Force graph with zoom/pan, drag, pin/unpin, neighbor highlighting, window resize handling.
- Link tooltips showing relation type and confidence; node tooltip shows name (and genres when available).
- Final report shown in the right sidebar.
- Python 3.10+ (tested on 3.13) and a working Ollama installation if using `ChatOllama`.
```bash
python -m venv venv
# Windows PowerShell
venv\Scripts\Activate.ps1
# macOS/Linux
source venv/bin/activate
pip install -r requirements.txt
```

Create a `.env` file in the project root with:
```env
# LLM
LLM_MODE=OLLAMA
OLLAMA_MODEL=llama3.1:latest  # example model; ensure it's pulled in your Ollama install
# Tavily
TAVILY_API_KEY=your_tavily_api_key
# Spotify (Client Credentials flow; register an app in the Spotify Developer Dashboard to get these)
SPOTIPY_CLIENT_ID=your_spotify_client_id
SPOTIPY_CLIENT_SECRET=your_spotify_client_secret
```

Notes:
- `src/llm.py` expects `LLM_MODE=OLLAMA` and `OLLAMA_MODEL` to be set.
- If Ollama is not running or the model is missing, LLM calls will fail.
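Since missing settings only surface when an LLM or API call fails, a fail-fast startup check can help; a sketch using only the standard library (variable names taken from the `.env` example above, the helper itself is not part of the project):

```python
import os

REQUIRED_VARS = ["LLM_MODE", "OLLAMA_MODEL", "TAVILY_API_KEY",
                 "SPOTIPY_CLIENT_ID", "SPOTIPY_CLIENT_SECRET"]

def missing_env(env: dict) -> list[str]:
    """Return the required variables that are absent or empty."""
    return [v for v in REQUIRED_VARS if not env.get(v)]

# Example: check the current process environment at startup.
missing = missing_env(dict(os.environ))
if missing:
    print("Missing required settings:", ", ".join(missing))
```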
Recommended (module style):

```bash
uvicorn src.main:app --reload
```

Alternatively (script style):

```bash
python src/main.py
```

Open the UI at http://localhost:8000/.
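The API can also be exercised from Python; note that the endpoint path below (`/api/explore`) is an assumption (check `src/main.py` for the actual route). The shape check is kept separate from the network call so it can run without a server:

```python
import json
import urllib.request

def validate_response(resp: dict) -> bool:
    """Check a response against the documented shape (graph_data + report)."""
    graph = resp.get("graph_data", {})
    nodes_ok = all("id" in n for n in graph.get("nodes", []))
    links_ok = all({"source", "target"} <= set(l) for l in graph.get("links", []))
    return nodes_ok and links_ok and isinstance(resp.get("report"), str)

def explore(seed_query: str, max_loops: int = 10) -> dict:
    # NOTE: the endpoint path is a guess; check src/main.py for the real route.
    req = urllib.request.Request(
        "http://localhost:8000/api/explore",
        data=json.dumps({"seed_query": seed_query, "max_loops": max_loops}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as r:
        return json.loads(r.read())

# Example (requires a running server):
#   result = explore("UK '70s post punk")
#   assert validate_response(result)
```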
- End-to-end pipeline from seed query to graph + report.
- Undirected links with per-link `type` and `confidence` (defaulted when the LLM cannot determine them).
- Interactive graph viewer with meaningful interactions and link tooltips.
- Spotify genres for context (used in prompts; future UI enhancements planned).
- Enrich link semantics (clearer types plus direction only when meaningful, e.g., influence).
- Side panel with artist details: genres, Spotify profile, top tracks, audio previews.
- Caching and rate limiting for external APIs.
- Alternative LLM backends (e.g., OpenAI, Anthropic) via LangChain configuration.
- Persist graphs and reports for later retrieval and comparison.
