Django REST API with automatic OpenTelemetry instrumentation, JWT authentication, Celery background tasks, and PostgreSQL integration with base14 Scout.
| Component | Version | EOL Status | Current Version |
|---|---|---|---|
| Python | 3.14 | Oct 2030 | 3.14.2 |
| Django | 5.2 LTS | Apr 2028 | 5.2.9 |
| Django REST Framework | 3.16 | Stable | 3.16.1 |
| PostgreSQL | 18 | Nov 2029 | 18.1 |
| Celery | 5.6 | Stable | 5.6.0 |
| Redis | 8 | Active | 8.0 |
| OpenTelemetry SDK | 1.39 | N/A | 1.39.1 |
Why This Matters: Production-ready Django stack with LTS support, background task processing via Celery, and comprehensive OpenTelemetry instrumentation for full observability.
- ✅ HTTP requests and responses (Django automatic instrumentation)
- ✅ Database queries (psycopg automatic instrumentation)
- ✅ Redis operations (Redis automatic instrumentation)
- ✅ Celery task execution (Celery automatic instrumentation)
- ✅ Distributed trace propagation (W3C Trace Context)
- ✅ Log export with trace correlation (OTLP logs)
- ✅ Error tracking with automatic exception capture
- ✅ PII masking at collector level (emails redacted)
- Traces: User authentication, article CRUD, favorites with custom spans
- Attributes: User ID, article slug, job metadata, error context
- Logs: Structured logs with trace correlation (trace_id, span_id)
- Metrics: HTTP metrics, auth attempts, article counts, job metrics
- Business-specific custom spans and attributes
- Advanced metrics beyond HTTP and job basics
- Custom log correlation patterns
- WebSocket instrumentation (if needed)
| Component | Package | Version |
|---|---|---|
| Python | python | 3.14 |
| Django | django | 5.2.9 |
| REST Framework | djangorestframework | 3.16.1 |
| PostgreSQL Driver | psycopg[binary] | 3.2+ |
| Background Tasks | celery | 5.6.0 |
| Redis Client | redis | 5.2+ |
| Authentication | PyJWT | 2.10+ |
| OTel SDK | opentelemetry-sdk | 1.39.1 |
| OTel Django | opentelemetry-instrumentation-django | 0.60b1 |
| OTel Celery | opentelemetry-instrumentation-celery | 0.60b1 |
| OTel Psycopg | opentelemetry-instrumentation-psycopg | 0.60b1 |
| OTel Logging | opentelemetry-instrumentation-logging | 0.60b1 |
| OTel Exporter | opentelemetry-exporter-otlp | 1.39.1 |
| WSGI Server | gunicorn | 23.0+ |
- Docker & Docker Compose - Install Docker
- base14 Scout Account - Sign up
- Python 3.14+ (for local development)
```shell
git clone https://github.com/base-14/examples.git
cd examples/python/django-postgres
```

```shell
# Copy environment template
cp .env.example .env

# Generate a secure SECRET_KEY
openssl rand -hex 32
```

Edit `.env` and update the required values.
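If `openssl` is not available, the same kind of key can be generated with Python's standard library (equivalent output: 32 random bytes as 64 hex characters):

```python
import secrets

# 64 hex characters = 32 random bytes, same strength as `openssl rand -hex 32`
print(secrets.token_hex(32))
```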
Add these to your environment or .env file:
```shell
export SCOUT_ENDPOINT=https://your-tenant.base14.io:4318
export SCOUT_CLIENT_ID=your_client_id
export SCOUT_CLIENT_SECRET=your_client_secret
export SCOUT_TOKEN_URL=https://your-tenant.base14.io/oauth/token
```

```shell
# Start all services
docker compose up -d

# Check all services are running
docker compose ps

# Test health endpoint
curl http://localhost:8000/api/health

# Run the API test script
./scripts/test-api.sh
```

| Method | Endpoint | Description | Auth |
|---|---|---|---|
| POST | `/api/register` | Register new user | No |
| POST | `/api/login` | Login and get JWT token | No |
| GET | `/api/user` | Get current user profile | Yes |
| POST | `/api/logout` | Logout | Yes |
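The API issues HS256 JWTs via PyJWT. Purely to illustrate the token format a login response contains, here is a stdlib-only sketch of assembling such a token (the helper names `b64url` and `sign_hs256_jwt` are mine, not from the repo; the real app uses `jwt.encode`):

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWT segments use unpadded URL-safe base64
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_hs256_jwt(payload: dict, secret: str) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signature = b64url(
        hmac.new(secret.encode(), f"{header}.{body}".encode(), hashlib.sha256).digest()
    )
    return f"{header}.{body}.{signature}"

token = sign_hs256_jwt({"sub": 1, "email": "alice@example.com"}, "dev-secret")
print(token.count("."))  # → 2 (three dot-separated segments)
```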
| Method | Endpoint | Description | Auth |
|---|---|---|---|
| GET | `/api/articles/` | List articles (paginated) | No |
| POST | `/api/articles/` | Create article | Yes |
| GET | `/api/articles/{slug}` | Get single article | No |
| PUT | `/api/articles/{slug}` | Update article | Yes (owner) |
| DELETE | `/api/articles/{slug}` | Delete article | Yes (owner) |
| POST | `/api/articles/{slug}/favorite` | Favorite article | Yes |
| DELETE | `/api/articles/{slug}/favorite` | Unfavorite article | Yes |
| Method | Endpoint | Description |
|---|---|---|
| GET | `/api/health` | Health check (db, redis) |
```shell
curl -X POST http://localhost:8000/api/register \
  -H "Content-Type: application/json" \
  -d '{"email": "alice@example.com", "name": "Alice", "password": "password123"}'
```

Response:

```json
{
  "user": {
    "id": 1,
    "email": "alice@example.com",
    "name": "Alice",
    "bio": "",
    "image": "",
    "created_at": "2025-12-27T06:42:13Z"
  },
  "token": {
    "access_token": "eyJhbGciOiJIUzI1NiIs...",
    "token_type": "Bearer"
  }
}
```

```shell
curl -X POST http://localhost:8000/api/articles/ \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <token>" \
  -d '{"title": "My Article", "body": "Article content here", "description": "A brief description"}'
```

Response:

```json
{
  "slug": "my-article-1735284134081",
  "title": "My Article",
  "description": "A brief description",
  "body": "Article content here",
  "author": {"id": 1, "email": "alice@example.com", "name": "Alice"},
  "favorites_count": 0,
  "favorited": false,
  "created_at": "2025-12-27T06:42:14Z"
}
```

```shell
curl http://localhost:8000/api/health
```

Response:

```json
{
  "status": "healthy",
  "components": {
    "database": "healthy",
    "redis": "healthy"
  },
  "service": {
    "name": "django-postgres-celery-app",
    "version": "1.0.0"
  }
}
```

Distributed traces capture the full request lifecycle:
- ✅ HTTP request handling with route attributes
- ✅ Database queries with SQL statements
- ✅ Redis operations
- ✅ Celery task execution with job context
- ✅ Custom business spans (auth, CRUD, favorites)
Distributed Tracing Example - POST /api/articles/ creates an article and triggers a Celery task, all correlated by the same trace ID:
```text
App (django-postgres-celery-app):
  → HTTP POST /api/articles/
  → article.create span
  → apply_async/send_article_notification span

Worker (django-postgres-celery-worker):
  → run/send_article_notification span
  → job.send_article_notification span

All spans share: otelTraceID: 59e443df8f7614a5b21c11d8c8f83a8d
```
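The shared trace ID travels between app and worker as W3C Trace Context, carried in the `traceparent` header (the Celery instrumentation propagates it in message headers). A minimal stdlib sketch of that header's layout, using the trace ID above (the parser name is mine, for illustration only):

```python
# Parse a W3C Trace Context `traceparent` header: version-traceid-spanid-flags
def parse_traceparent(header: str) -> dict:
    version, trace_id, span_id, flags = header.split("-")
    return {
        "version": version,      # "00" in the current spec revision
        "trace_id": trace_id,    # 32 hex chars (16 bytes)
        "span_id": span_id,      # 16 hex chars (8 bytes)
        "sampled": bool(int(flags, 16) & 0x01),
    }

ctx = parse_traceparent("00-59e443df8f7614a5b21c11d8c8f83a8d-867d079fdf8865b7-01")
print(ctx["trace_id"])  # → 59e443df8f7614a5b21c11d8c8f83a8d
```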
| Metric | Type | Description |
|---|---|---|
| `http_requests_total` | Counter | Total HTTP requests by method, route, status |
| `http_request_duration_ms` | Histogram | Request latency in milliseconds |
| `auth.login.attempts` | Counter | Login attempts by status (success/failed) |
| `articles.created` | Counter | Articles created by author |
| `jobs.completed` | Counter | Completed background jobs |
| `jobs.failed` | Counter | Failed background jobs |
| `jobs.duration_ms` | Histogram | Job execution time |
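Conceptually, a counter like `http_requests_total` is aggregated per attribute set: each unique combination of method, route, and status is its own time series. A stdlib illustration of that labeling model (not the OTel SDK, just the idea):

```python
from collections import Counter

http_requests_total = Counter()

def record_request(method: str, route: str, status: int) -> None:
    # Each unique (method, route, status) combination is a separate series
    http_requests_total[(method, route, status)] += 1

record_request("GET", "/api/articles/", 200)
record_request("GET", "/api/articles/", 200)
record_request("POST", "/api/articles/", 201)
print(http_requests_total[("GET", "/api/articles/", 200)])  # → 2
```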
All logs are exported via OTLP with trace correlation:
```text
LogRecord:
  SeverityText: INFO
  Body: "User registered: al***@example.com"
  Attributes:
    → otelTraceID: "59e443df8f7614a5b21c11d8c8f83a8d"
    → otelSpanID: "867d079fdf8865b7"
    → code.file.path: "/app/apps/users/views.py"
    → code.function.name: "register"
    → code.line.number: 40
```
Log messages from app and worker are correlated by trace ID, enabling end-to-end debugging across services.
The telemetry is initialized in `apps/core/telemetry.py`:

```python
from opentelemetry import trace, metrics, _logs
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.metrics import MeterProvider
from opentelemetry.sdk._logs import LoggerProvider, LoggingHandler
from opentelemetry.instrumentation.django import DjangoInstrumentor
from opentelemetry.instrumentation.psycopg import PsycopgInstrumentor
from opentelemetry.instrumentation.celery import CeleryInstrumentor
from opentelemetry.instrumentation.logging import LoggingInstrumentor

# Auto-instrument frameworks
DjangoInstrumentor().instrument()
PsycopgInstrumentor().instrument()
CeleryInstrumentor().instrument()
LoggingInstrumentor().instrument(set_logging_format=True)
```

For Celery workers, telemetry is initialized per-worker process via the `worker_process_init` signal in `config/celery.py`.
```python
from apps.core.telemetry import get_tracer

tracer = get_tracer(__name__)

with tracer.start_as_current_span("article.create") as span:
    span.set_attribute("user.id", user.id)
    span.set_attribute("article.slug", article.slug)
    # ... business logic
```

```python
from apps.core.telemetry import get_meter

meter = get_meter(__name__)
articles_created = meter.create_counter("articles.created")
articles_created.add(1, {"author_id": str(user.id)})
```

Email addresses are automatically masked at the collector level using the transform processor:
```yaml
# config/otel-config.yaml
processors:
  transform/pii:
    error_mode: ignore
    log_statements:
      - context: log
        statements:
          # alice@example.com → al***@example.com
          - replace_pattern(body, "([a-zA-Z0-9._%+-]{2})[a-zA-Z0-9._%+-]*@...", "$$1***@$$2")
    trace_statements:
      - context: span
        statements:
          - replace_pattern(attributes["user.email"], "...", "$$1***@$$2")
```

This ensures PII is redacted before data leaves the collector, without requiring application code changes.
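To see what the collector's pattern does, here is a stdlib approximation of the masking (the domain-matching part of the regex is my assumption, since the config above elides it with `...`):

```python
import re

# Keep the first two characters of the local part, mask the rest,
# keep the domain — mirrors the collector's transform processor.
EMAIL_RE = re.compile(r"([a-zA-Z0-9._%+-]{2})[a-zA-Z0-9._%+-]*@([a-zA-Z0-9.-]+)")

def mask_email(text: str) -> str:
    return EMAIL_RE.sub(r"\1***@\2", text)

print(mask_email("User registered: alice@example.com"))
# → User registered: al***@example.com
```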
| Column | Type | Description |
|---|---|---|
| id | BIGSERIAL | Primary key |
| email | VARCHAR(255) | Unique email |
| password | VARCHAR(128) | Hashed password |
| name | VARCHAR(255) | Display name |
| bio | TEXT | User bio |
| image | VARCHAR(200) | Avatar URL |
| created_at | TIMESTAMP | Creation time |
| updated_at | TIMESTAMP | Last update |
| Column | Type | Description |
|---|---|---|
| id | BIGSERIAL | Primary key |
| slug | VARCHAR(255) | Unique URL slug |
| title | VARCHAR(255) | Article title |
| description | TEXT | Brief description |
| body | TEXT | Article content |
| author_id | BIGINT | FK to users |
| favorites_count | INTEGER | Cached favorite count |
| created_at | TIMESTAMP | Creation time |
| updated_at | TIMESTAMP | Last update |
| Column | Type | Description |
|---|---|---|
| id | BIGSERIAL | Primary key |
| user_id | BIGINT | FK to users |
| article_id | BIGINT | FK to articles |
| created_at | TIMESTAMP | Creation time |
```text
django-postgres/
├── config/
│   ├── settings.py        # Django settings
│   ├── celery.py          # Celery configuration
│   ├── urls.py            # URL routing
│   ├── wsgi.py            # WSGI application
│   └── otel-config.yaml   # OTel Collector config
├── apps/
│   ├── core/
│   │   ├── telemetry.py       # OpenTelemetry setup
│   │   ├── middleware.py      # Metrics middleware
│   │   ├── exceptions.py      # Exception handlers
│   │   └── views.py           # Health endpoint
│   ├── users/
│   │   ├── models.py          # User model
│   │   ├── views.py           # Auth endpoints
│   │   ├── serializers.py     # DRF serializers
│   │   └── authentication.py  # JWT auth
│   ├── articles/
│   │   ├── models.py          # Article, Favorite models
│   │   ├── views.py           # CRUD endpoints
│   │   └── serializers.py     # DRF serializers
│   └── jobs/
│       └── tasks.py           # Celery tasks
├── scripts/
│   └── test-api.sh            # API test script
├── tests/
│   └── conftest.py            # Pytest fixtures
├── compose.yml                # Docker Compose
├── Dockerfile                 # Multi-stage build
└── requirements.txt           # Dependencies
```
```shell
# Create virtual environment
python -m venv venv
source venv/bin/activate

# Install dependencies
pip install -r requirements.txt

# Start infrastructure services
docker compose up -d postgres redis otel-collector

# Run migrations
python manage.py migrate

# Start development server
python manage.py runserver

# Start Celery worker (separate terminal)
celery -A config worker --loglevel=info
```

```shell
# API integration tests
./scripts/test-api.sh

# Unit tests with pytest
pytest tests/ -v

# With coverage
pytest tests/ --cov=apps --cov-report=html
```

```shell
# Build and start all services
docker compose up -d --build

# View logs
docker compose logs -f app celery

# Execute Django commands
docker compose exec app python manage.py migrate
docker compose exec app python manage.py createsuperuser

# Stop services
docker compose down

# Clean up volumes
docker compose down -v
```

After starting the application and generating some traffic:
- Log in to base14 Scout
- Navigate to Services → django-postgres-celery-app
- View distributed traces, metrics, and logs
- Explore the service map to see dependencies
```shell
# Check health status
docker compose ps
docker compose logs postgres
docker compose logs redis
```

```shell
# Verify PostgreSQL is ready
docker compose exec postgres pg_isready -U postgres

# Check connection from app
docker compose exec app python -c "from django.db import connection; connection.ensure_connection()"
```

```shell
# Check worker logs
docker compose logs celery

# Verify Redis connection
docker compose exec redis redis-cli ping
```

```shell
# Check collector health
curl http://localhost:13133/health

# View collector logs
docker compose logs otel-collector

# Verify credentials are set
echo $SCOUT_CLIENT_ID
```