# An example API backend using Amazon Aurora DSQL ✨

This repo provides everything you need to deploy and load-test an API using an Amazon Aurora DSQL PostgreSQL backend.
- Features
- Prerequisites
- Deployment
- Creating the Database Schema
- OpenAPI specification
- Running a Load Test
- Exploring the Database using `psql`
- Exploring Costs 💰
- Running the API locally
## Features

- Fastify API backend using AWS Lambda and Drizzle ORM
- API Gateway REST API
- OpenAPI specification
- Load test script
- CDK Infrastructure
## Prerequisites

- A recent AWS CLI version that supports DSQL (`aws dsql`)
- Node.js 22 or later
- (Optional) A PostgreSQL `psql` client - https://www.postgresql.org/download/
## Deployment

The CDK application provides two stacks:

- `DsqlExampleStack` - provisions the DSQL cluster
- `ApplicationStack` - provisions the API and Lambda backend

```shell
cd cdk
npm install
npx cdk deploy --all
```

The application should deploy and print the stack outputs, including `ApplicationStack.apiEndpoint`. Set this as `BASE_URL` in your shell environment. We'll use it later when you run load tests!
```shell
export BASE_URL=https://<REST_API_ID>.execute-api.<REGION>.amazonaws.com/prod
```

## Creating the Database Schema

To create the database schema, we are going to use Drizzle Kit. DSQL does not support the `SERIAL` data type used for incremental Drizzle migrations, so we'll just use the `push` command to create the entire schema.
```shell
npm install
cd packages/db
# Set the PostgreSQL connection string
export DATABASE_URL=$(npx tsx ./packages/cli/cli.ts)
npx drizzle-kit push
```

## OpenAPI specification

You can access the API specification at `${BASE_URL}/openapi.json`.
The API provides CRUD endpoints for lists of items. Each resource is very simple, containing just an ID (UUID v4) and a name.
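As a sketch of that resource shape, here is what an item looks like in TypeScript. The `Item` type and `makeItem` helper are illustrative, not taken from the repo; only the two fields (UUID v4 `id` plus `name`) come from the description above.

```typescript
import { randomUUID } from "node:crypto";

// Illustrative only - the field names come from the description above;
// the Item type and makeItem helper are not part of the repo.
interface Item {
  id: string; // UUID v4
  name: string;
}

// Matches the 8-4-4-4-12 layout of a version 4 UUID.
const UUID_V4 =
  /^[0-9a-f]{8}-[0-9a-f]{4}-4[0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$/i;

function makeItem(name: string): Item {
  return { id: randomUUID(), name };
}

const item = makeItem("groceries");
console.log(UUID_V4.test(item.id)); // true - randomUUID() produces a v4 UUID
```

Generating IDs client-side like this is one reason the schema can avoid `SERIAL`, which DSQL does not support.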
Explore it with the Redocly CLI like this:

```shell
npx @redocly/cli preview-docs ${BASE_URL}/openapi.json
```

## Running a Load Test

An excellent way of generating API load is using k6. You will need to follow the instructions to install k6 locally.
Once that's done, you can run the default load test which simulates 100 virtual users (VUs) over a period of one minute.
✋ Make sure you have set `BASE_URL` in your shell as we mentioned above.
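The bundled script's source isn't reproduced here, but a minimal k6 script of this general shape (illustrative; not the repo's `test/load/full-api-test.js`) exercises an endpoint using `BASE_URL`:

```javascript
// Illustrative sketch only. Runs under the k6 runtime (`k6 run script.js`),
// not under Node.js.
import http from "k6/http";
import { check } from "k6";

export const options = {
  vus: 100,       // 100 virtual users...
  duration: "1m", // ...over one minute, matching the default described above
};

export default function () {
  // Exported shell variables such as BASE_URL are visible via __ENV.
  const res = http.get(`${__ENV.BASE_URL}/openapi.json`);
  check(res, { "status is 200": (r) => r.status === 200 });
}
```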
```shell
k6 run ./test/load/full-api-test.js
```

If you are feeling more ambitious and don't mind spending a bit more on Lambda, API Gateway, data transfer, X-Ray, or DSQL costs, you can increase the number of virtual users. For 1,000 virtual users over two minutes:

```shell
k6 run ./test/load/full-api-test.js --vus 1000 --duration 2m
```

## Exploring the Database using `psql`

One option to connect to your database is to use AWS CloudShell. For instructions on how to do this, and other connection examples, see https://docs.aws.amazon.com/aurora-dsql/latest/userguide/getting-started.html.
A helper script used by CloudShell is also provided here so you can connect using `psql`. It uses the AWS CLI to generate an IAM auth token for your connection.

```shell
./bin/dsql-connect --hostname DSQL_HOST --region eu-west-1 --database postgres --username admin
```

You'll need to replace `DSQL_HOST` with your cluster endpoint address. This is printed as a CloudFormation output (`DsqlExampleStack.endpoint`) when you run `cdk deploy` as instructed above.
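Under the hood, the token-based connection looks roughly like this sketch. The hostname and region are placeholders, and `psql` settings are passed through the standard libpq environment variables:

```shell
# Sketch of what dsql-connect does (values are placeholders).
# The AWS CLI prints a short-lived IAM auth token, which psql
# then picks up as the password via PGPASSWORD.
export DSQL_HOST=DSQL_HOST   # your cluster endpoint address
export PGPASSWORD=$(aws dsql generate-db-connect-admin-auth-token \
  --hostname "$DSQL_HOST" --region eu-west-1)
export PGSSLMODE=require     # DSQL connections require TLS
psql --host "$DSQL_HOST" --username admin --dbname postgres
```

The token is temporary, so regenerate it if a later connection attempt is rejected.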
## Exploring Costs 💰

Thanks to Alessandro Volpicella's Amazon Aurora DSQL Pricing Guide, we discovered Marc Bowes' guide and script for calculating cost based on DSQL usage, which we include here.

```shell
./fetch_dpus.sh <DSQL_CLUSTER_ID>
```

## Running the API locally

If you want to dive deeper into this architecture, make changes, and iterate quickly, you can run the API locally.
You can run your local API against either DSQL or a local PostgreSQL by setting `DB_ENDPOINT` to `localhost` or your cluster endpoint domain name.

```shell
npx tsx ./packages/api/api.ts
```
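Concretely, the two options look like this; the cluster endpoint value is a placeholder for the `DsqlExampleStack.endpoint` output from your own deployment:

```shell
# Point the API at your DSQL cluster (placeholder endpoint):
export DB_ENDPOINT=<CLUSTER_ENDPOINT>
npx tsx ./packages/api/api.ts

# ...or at a local PostgreSQL instance:
export DB_ENDPOINT=localhost
npx tsx ./packages/api/api.ts
```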