A FastAPI service that supplements song requests with structured metadata, album artwork, and library catalog information. Built to enhance music request workflows with automated data enrichment and Slack integration.
- Smart Song Parsing: Uses Groq AI to extract structured metadata from natural language song requests
- Album Artwork Lookup: Fetches album artwork from Discogs
- Library Catalog Search: Full-text search across a local SQLite music library database
- Slack Integration: Posts enriched song data to Slack with embedded artwork
- FastAPI Backend: Built with FastAPI for high performance and automatic API documentation
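The library catalog search could, for illustration, be backed by SQLite's FTS5 extension. The sketch below uses a hypothetical `tracks` schema (the real library.db layout may differ) and assumes your Python build includes FTS5:

```python
import sqlite3

# Hypothetical schema; the real library.db layout may differ.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE tracks USING fts5(artist, title, album)")
conn.executemany(
    "INSERT INTO tracks VALUES (?, ?, ?)",
    [
        ("Queen", "Bohemian Rhapsody", "A Night at the Opera"),
        ("Manu Dibango", "Abele Dance", "Electric Africa"),
    ],
)

# MATCH runs full-text search across all indexed columns (terms are ANDed).
rows = conn.execute(
    "SELECT artist, title FROM tracks WHERE tracks MATCH ? LIMIT 5",
    ("Queen Bohemian",),
).fetchall()
```

Here `rows` contains only the Queen track, since both search terms must match.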
- Python 3.12 or higher
- pip (Python package installer), or use the included pyproject.toml for modern package management
```bash
git clone <repository-url>
cd request-parser
```

Create and activate a virtual environment:

```bash
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
```

Option A: Using requirements.txt

```bash
pip install -r requirements.txt
```

Option B: Using pyproject.toml (Recommended)

```bash
pip install -e .
```

Option C: With development dependencies

```bash
pip install -e ".[dev]"
```

Copy the example environment file and update it with your values:

```bash
cp .env.example .env
```

Then edit .env with your actual configuration:
```env
# Required
GROQ_API_KEY=your_groq_api_key_here

# Optional - Artwork Lookup
DISCOGS_TOKEN=your_discogs_token_here

# Optional - Slack Integration
SLACK_WEBHOOK_URL=https://hooks.slack.com/services/YOUR/WEBHOOK/URL

# Optional - Telemetry
POSTHOG_API_KEY=your_posthog_project_api_key
POSTHOG_HOST=https://us.i.posthog.com

# Application Configuration
LOG_LEVEL=INFO
PORT=8000

# Feature Flags
ENABLE_SLACK_INTEGRATION=true
ENABLE_ARTWORK_LOOKUP=true
```

- GROQ_API_KEY: Sign up at Groq (not Grok) to get an API key
- DISCOGS_TOKEN: Create a personal access token at Discogs Settings
- SLACK_WEBHOOK_URL: Create an incoming webhook in your Slack workspace's App Settings
- POSTHOG_API_KEY: Optional - Get your project API key from PostHog for telemetry tracking
The project includes a pre-built SQLite database (library.db). If you need to rebuild it or if the file is missing:
```bash
python scripts/export_to_sqlite.py
```

Start the application directly:

```bash
python main.py
```

Or run it with uvicorn:

```bash
uvicorn main:app --reload --host 0.0.0.0 --port 8000
```

The --reload flag enables auto-reloading during development.
The application will start on http://localhost:8000
- Interactive API Documentation: http://localhost:8000/docs (Swagger UI - Try out endpoints here!)
- Read-Only Docs: http://localhost:8000/redoc (ReDoc - Beautiful documentation)
- Health Check: http://localhost:8000/health (Detailed service status)
Note: All API endpoints (except /health) are versioned under the /api/v1/ prefix.
```bash
# Build the image
docker build -t request-parser .

# Run the container
docker run -p 8000:8000 \
  -e GROQ_API_KEY=your_groq_api_key \
  -e DISCOGS_TOKEN=your_discogs_token \
  -e SLACK_WEBHOOK_URL=your_slack_webhook \
  request-parser
```

Create a docker-compose.yml:
```yaml
version: '3.8'
services:
  app:
    build: .
    ports:
      - "8000:8000"
    environment:
      - GROQ_API_KEY=${GROQ_API_KEY}
      - DISCOGS_TOKEN=${DISCOGS_TOKEN}
      - SLACK_WEBHOOK_URL=${SLACK_WEBHOOK_URL}
    env_file:
      - .env
```

Then run:
```bash
docker-compose up
```

All endpoints except /health are prefixed with /api/v1:

- GET /health - Health check with service status details
- POST /api/v1/parse - Parse a natural language song request into structured metadata
- POST /api/v1/request - Full request workflow: parse → search library → find artwork → post to Slack
- POST /api/v1/artwork - Find album artwork for a given song/album/artist
- GET /api/v1/library/search - Search the library catalog
Parse a message:
```bash
curl -X POST "http://localhost:8000/api/v1/parse" \
  -H "Content-Type: application/json" \
  -d '{"message": "Play Bohemian Rhapsody by Queen"}'
```

Full request workflow:

```bash
curl -X POST "http://localhost:8000/api/v1/request" \
  -H "Content-Type: application/json" \
  -d '{"message": "Play Abele Dance (85 remix) by Manu Dibango"}'
```

Search library:

```bash
curl "http://localhost:8000/api/v1/library/search?q=Queen+Bohemian&limit=5"
```

Health check:

```bash
curl "http://localhost:8000/health"
```

Run all tests (excluding integration):
```bash
pytest
```

Run unit tests only:

```bash
pytest tests/unit/
```

Run integration tests:

```bash
pytest tests/integration/ -m integration
```

Run with coverage:

```bash
pytest --cov=. --cov-report=html
```

The project is configured with modern Python tooling via pyproject.toml:
Format code:
```bash
black .
```

Lint code:

```bash
ruff check .
```

Fix linting issues automatically:

```bash
ruff check --fix .
```

Type checking:

```bash
mypy .
```

Run all quality checks:

```bash
black . && ruff check --fix . && mypy . && pytest
```

- Create a feature branch
- Make your changes
- Run tests and linters
- Submit a pull request
The project uses:
- Pydantic Settings for type-safe configuration
- FastAPI dependency injection for clean architecture
- Async/await throughout for performance
- Comprehensive logging with structured output
- Custom exceptions for better error handling
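A minimal sketch of what a domain-specific exception hierarchy like this might look like (class names are illustrative, not the project's actual exceptions):

```python
class RequestParserError(Exception):
    """Illustrative base class for domain errors."""

class ArtworkNotFoundError(RequestParserError):
    """Raised when no artwork can be located for a release."""
    def __init__(self, artist: str, album: str):
        super().__init__(f"No artwork found for {artist} - {album}")
        self.artist = artist
        self.album = album

try:
    raise ArtworkNotFoundError("Queen", "A Night at the Opera")
except RequestParserError as exc:  # one handler catches any domain error
    caught = str(exc)
```

A shared base class lets API-boundary code map every domain failure to a clean error response while keeping structured fields (artist, album) available for logging.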
If you see an error about library.db not being found:
```bash
python scripts/export_to_sqlite.py
```

Ensure your .env file exists in the project root and contains:

```env
GROQ_API_KEY=your_actual_key_here
```
If port 8000 is already in use, specify a different port:
```bash
uvicorn main:app --port 8001
```

If Slack integration fails:
- Verify your webhook URL is correct
- Check that your Slack app has proper permissions
- The app will attempt to fetch a webhook from Railway if SLACK_WEBHOOK_URL is not set
| Variable | Required | Default | Description |
|---|---|---|---|
| GROQ_API_KEY | Yes | - | API key for Groq AI service |
| DISCOGS_TOKEN | No | - | Personal access token for Discogs API |
| SLACK_WEBHOOK_URL | No | - | Slack incoming webhook URL (fetches from Railway if not set) |
| PORT | No | 8000 | Port for the application to listen on |
| HOST | No | 0.0.0.0 | Host to bind the server to |
| LOG_LEVEL | No | INFO | Logging level (DEBUG, INFO, WARNING, ERROR) |
| LIBRARY_DB_PATH | No | library.db | Path to SQLite library database |
| ENABLE_SLACK_INTEGRATION | No | true | Enable/disable Slack notifications |
| ENABLE_ARTWORK_LOOKUP | No | true | Enable/disable artwork lookup from external APIs |
| POSTHOG_API_KEY | No | - | PostHog project API key for telemetry tracking |
| POSTHOG_HOST | No | https://us.i.posthog.com | PostHog host URL |
| SENTRY_DSN | No | - | Sentry DSN for error tracking |
| DATABASE_URL_DISCOGS | No | - | PostgreSQL URL for Discogs cache (see Discogs Cache Setup) |
The service supports an optional PostgreSQL cache for Discogs data to reduce API calls. When enabled, the service queries the local cache first and falls back to the Discogs API on cache misses.
The cache ETL pipeline (building the database from Discogs data dumps) lives in a separate repo: WXYC/discogs-cache. See that repo for full setup instructions.
To enable the cache, set the environment variable:
```env
DATABASE_URL_DISCOGS=postgresql://user:pass@host:5432/discogs
```

If DATABASE_URL_DISCOGS is not set, the service uses the Discogs API directly (existing behavior).
- Dependency Injection: FastAPI's dependency injection system manages service lifecycle and makes testing easier
- Centralized Configuration: Pydantic Settings for type-safe, validated configuration
- Modular Structure: Each feature (artwork, library, parsing) is self-contained
- Async Throughout: All I/O operations use async/await for optimal performance
- Custom Exceptions: Domain-specific exceptions for better error handling and debugging
- Comprehensive Logging: Structured logging at appropriate levels throughout the application
- Hybrid Caching: Optional PostgreSQL cache built from Discogs data dumps with trigram fuzzy matching, graceful degradation to API-only mode
- Error Tracking: Sentry integration for production error monitoring with breadcrumbs for debugging
Services are managed through FastAPI's lifespan context manager:
- Database connections are established at startup
- HTTP clients are reused across requests
- Resources are properly cleaned up at shutdown
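FastAPI lifespan handlers are async context managers; the startup/shutdown pattern above can be sketched framework-free like this (resource names are illustrative stand-ins):

```python
import asyncio
from contextlib import asynccontextmanager
from types import SimpleNamespace

@asynccontextmanager
async def lifespan(app):
    # Startup: open shared resources once and reuse them across requests.
    app.state.db = "sqlite-connection"     # stand-in for a real DB handle
    app.state.http = "shared-http-client"  # stand-in for e.g. an async HTTP client
    try:
        yield
    finally:
        # Shutdown: always release resources, even on error.
        app.state.db = None
        app.state.http = None

async def main():
    app = SimpleNamespace(state=SimpleNamespace())
    async with lifespan(app):
        assert app.state.db == "sqlite-connection"  # resources live inside the context
    return app

app = asyncio.run(main())
```

Everything before the `yield` runs once at startup and everything after it at shutdown, which is why connections opened there can be shared across requests.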
[Add your license here]
[Add contribution guidelines here]
For issues and questions, please open an issue on GitHub.