A comprehensive Event Sourcing implementation using PostgreSQL as the event store, with Kafka for event-driven architecture and Docker for containerization.
This solution implements a complete event sourcing pattern and event-driven architecture with the following components:
- EventSourcing - Core event sourcing logic and interfaces, including aggregate roots, domain events, and event store abstractions
- EventSourcing.Postgres - PostgreSQL-based event store implementation
- EventBus - Core event bus logic and interfaces, including event consumers, event handlers, and integration event definitions
- EventBus.Kafka - Kafka-based event bus for reliable message delivery and processing
- ESsample.Banking.API - Web API that handles banking operations and publishes integration events. Features are created using Vertical Slice Architecture and CQRS pattern
- ESsample.Banking.Shared - Shared contracts and events between producer and consumers
- Projections.Banking - A sample projection logic to build read models from the different integration events consumed
- Projections.Banking.API - Web API to query the read models built by the consumer
- Projections.Banking.Consumer - Background service that consumes and processes integration events in order to update the read models
- Projections.Banking.Postgres - PostgreSQL implementation for read models storage
- PostgreSQL - Event store database
- Entity Framework Core - ORM for database interactions
- Automated Migrations - Database schema management using Entity Framework Core migrations
- Apache Kafka - Message broker for event streaming
- Kafka UI - Web interface for monitoring Kafka topics and messages
- Zookeeper - Coordination service for Kafka
- Docker - Containerization platform
- Docker Compose - Orchestration tool for the multi-container Docker application
- Health Checks - Built-in health checks for services to monitor their status
- Logging - Structured logging using Serilog for better observability
- Swagger/OpenAPI - API documentation and testing interface
- Razor Pages - Simple web front-end for user interactions
- xUnit - Unit testing framework for .NET
- Moq - Mocking library for unit tests
- Testcontainers - Integration testing with real dependencies in Docker containers
- Event Sourcing - Storing state changes as a sequence of events
- CQRS (Command Query Responsibility Segregation) - Separate models for reading and writing data
- Domain-Driven Design (DDD) - Structuring the code around the business domain
- Vertical Slice Architecture - Organizing code by feature rather than by layer
- Event-Driven Architecture - Decoupling components through events
- Dependency Injection - Managing dependencies through constructor injection
- Asynchronous Programming - Using async/await for non-blocking operations
- Docker and Docker Compose
- .NET 8.0 SDK (for local development)
- Git
- pgAdmin or any PostgreSQL client (optional, for database inspection)
- Clone the repository

  ```bash
  git clone https://github.com/jarDotNet/EDA-ES
  cd EDA-ES
  ```

- Start all services

  ```bash
  docker-compose up -d --build
  ```

- Verify services are running

  ```bash
  docker-compose ps
  ```

- Access the services

  - Event Sourcing Banking Web service: http://localhost:8082
  - Banking Projections Web service: http://localhost:8084
  - Kafka UI: http://localhost:8080
- Start infrastructure services

  ```bash
  docker-compose up -d postgres kafka zookeeper kafka-ui
  ```

- Run the Event Sourcing Banking API

  ```bash
  cd ESsample.Banking.API
  dotnet run
  ```

- Run the Consumer (in another terminal)

  ```bash
  cd Projections.Banking.Consumer
  dotnet run
  ```

- Run the Banking Projections API

  ```bash
  cd Projections.Banking.API
  dotnet run
  ```
- POST `/api/accounts` - Create a new account
- GET `/api/accounts/{id}` - Get account details
- GET `/api/accounts/{id}/history` - Get account event history
- POST `/api/accounts/{id}/deposit` - Deposit money
- POST `/api/accounts/{id}/withdraw` - Withdraw money
- POST `/api/accounts/{id}/transfer` - Money transfer between accounts
Create Account:

```bash
curl -X POST "http://localhost:8082/api/accounts" \
  -H "Content-Type: application/json" \
  -d '{
    "accountNumber": "ACC001",
    "accountName": "My savings account",
    "ownerName": "John Doe",
    "initialBalance": 1000.00
  }'
```

Deposit Money:

```bash
curl -X POST "http://localhost:8082/api/accounts/{account-id}/deposit" \
  -H "Content-Type: application/json" \
  -d '{
    "amount": 500.00,
    "description": "Salary deposit"
  }'
```

Access Swagger UI at http://localhost:8082/swagger to explore and test the API endpoints.
Access the simple web front-end at http://localhost:8082 to visit the home page and interact with the banking features, including Swagger UI, Account History, and Account Balance web pages.
- GET `/api/reports/balances` - List all account balances from the reading model
- GET `/api/reports/transactions/{id}` - List all transactions for the specified account in the reading model
Access Swagger UI at http://localhost:8084/swagger to explore and test the API endpoints.
Access the simple web front-end at http://localhost:8084 to visit the home page and interact with the banking projections features, including Swagger UI, Account Balances, and Account Transactions web pages.
- Command Processing: Event Sourcing Banking API receives commands (create account, deposit, withdraw, transfer)
- Event Generation: Domain events are generated and stored in the event store
- Integration Events: Integration events are published to Kafka event bus
- Event Consumption: Consumer service processes integration events
- Side Effects: Projections Consumer triggers notifications, updates read models, etc.
- Read Models: Projections API queries the read models built by the consumer and lists account balances and transactions
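Steps 1-2 of this flow can be sketched as an event-sourced aggregate. This is a minimal illustration with hypothetical names (`BankAccountSketch`, `MoneyDeposited`), not the solution's actual domain types:

```csharp
using System;
using System.Collections.Generic;

// Minimal event-sourced aggregate sketch (illustrative names only).
public record MoneyDeposited(Guid AccountId, decimal Amount, string Description);

public class BankAccountSketch
{
    private readonly List<object> _uncommittedEvents = new();

    public Guid Id { get; private set; } = Guid.NewGuid();
    public decimal Balance { get; private set; }

    // Command handler: validate, then raise a domain event.
    public void Deposit(decimal amount, string description)
    {
        if (amount <= 0)
            throw new InvalidOperationException("Deposit amount must be positive.");
        Raise(new MoneyDeposited(Id, amount, description));
    }

    // State changes only by applying events, so replaying the stream
    // stored in the event store rebuilds exactly the same state.
    public void Apply(object @event)
    {
        if (@event is MoneyDeposited deposited)
            Balance += deposited.Amount;
    }

    private void Raise(object @event)
    {
        Apply(@event);
        _uncommittedEvents.Add(@event); // persisted to the event store on save
    }
}
```

The key invariant is that command handlers never mutate state directly; they only raise events, and `Apply` is the single place state changes.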
- AccountCreatedEvent - Published when a new account is created
- AccountBalanceUpdatedEvent - Published when the account balance changes (deposits, withdrawals, or transfers)
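As a rough sketch, these integration events could be plain records. The actual contracts live in ESsample.Banking.Shared; the property names below are assumptions for illustration:

```csharp
using System;

// Illustrative integration event contracts; property names are assumptions.
public record AccountCreatedEvent(
    Guid AccountId,
    string AccountNumber,
    string AccountName,
    string OwnerName,
    decimal InitialBalance,
    DateTime OccurredOn);

public record AccountBalanceUpdatedEvent(
    Guid AccountId,
    string AccountNumber,
    decimal Amount,       // signed: positive for deposits, negative for withdrawals
    decimal NewBalance,
    string Description,
    DateTime OccurredOn);
```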
| Service | Port | Description |
|---|---|---|
| banking-producer | 8082 | Banking API (Event Sourcing) |
| banking-consumer | - | Integration Events consumer service |
| banking-projections | 8084 | Projections API (reading models) |
| banking-postgres-db | 5432 | PostgreSQL host for event storage and reading models databases |
| kafka | 9092 | Kafka message broker |
| kafka-ui | 8080 | Kafka monitoring UI |
| zookeeper | 2181 | Kafka coordination service |
Access Kafka UI at http://localhost:8080 to:
- View topics and messages
- Monitor consumer groups
- Check message throughput
- Debug event processing
View service logs:

```bash
# All services
docker-compose logs -f

# Specific service
docker-compose logs -f banking-consumer
docker-compose logs -f banking-producer-api
docker-compose logs -f banking-projections-api
```

Banking Producer (ESsample.Banking.API):

- `ConnectionStrings__EventStore` - PostgreSQL connection string for the event store
- `Kafka__BootstrapServers` - Kafka bootstrap servers
- `Kafka__TopicPrefix` - Kafka topic prefix for integration events

Banking Consumer (Projections.Banking.Consumer):

- `ConnectionStrings__ReadingModels` - PostgreSQL connection string for the reading models database
- `Kafka__BootstrapServers` - Kafka bootstrap servers
- `Kafka__TopicPrefix` - Topic prefix for integration events

Banking Projections (Projections.Banking.API):

- `ConnectionStrings__ReadingModels` - PostgreSQL connection string for the reading models database
This solution connects to external PostgreSQL databases instead of running its own instance:
Event Store database:

- Database Server: 127.0.0.1:5432
- Database Name: eventstore
- Connection String: `Host=banking-postgres-db;Port=5432;Database=eventstore;Username=postgres;Password=balrog`

Reading Models database:

- Database Server: 127.0.0.1:5432
- Database Name: banking
- Connection String: `Host=banking-postgres-db;Port=5432;Database=banking;Username=postgres;Password=balrog`
Before running the application, ensure your external PostgreSQL databases have the required schemas. You can run the migration manually:
```bash
# Connect to your PostgreSQL instance
psql -h 127.0.0.1 -p 5432 -U postgres -d eventstore

# The application will automatically create the required tables on first run.
# Or you can run the migration manually using Entity Framework:
dotnet ef database update --project EventSourcing.Postgres --startup-project ESsample.Banking.API
```

The event store uses a single table `mt_events` with the following structure:

- `seq_id` - Auto-incrementing sequence ID
- `id` - Event ID (UUID)
- `stream_id` - Aggregate ID (UUID)
- `version` - Event version within the stream
- `data` - Event payload (JSONB)
- `type` - Event type name
- `timestamp` - Event creation timestamp
- `tenant_id` - Tenant identifier for multi-tenancy support
- `mt_dotnet_type` - .NET type information for deserialization
- `is_archived` - Flag for soft deletion
```bash
# Connect to your PostgreSQL instance
psql -h 127.0.0.1 -p 5432 -U postgres -d banking

# The application will automatically create the required tables on first run.
# Or you can run the migration manually using Entity Framework:
dotnet ef database update --project Projections.Banking.Postgres --startup-project Projections.Banking.API
```

The reading models database contains two tables:

- `Balances` - Stores current balances for each account
- `Transactions` - Stores transaction history for each account

The Balances table structure is as follows:

- `AccountId` - Primary key (UUID)
- `AccountNumber` - Unique account number
- `AccountName` - Name of the account
- `OwnerName` - Name of the account owner
- `CurrentBalance` - Current account balance
- `LastUpdated` - Timestamp of the last update

And for the Transactions table:

- `TransactionId` - Primary key (UUID)
- `AccountId` - Foreign key to the account (UUID)
- `AccountNumber` - Account number
- `AccountName` - Name of the account
- `Amount` - Transaction amount
- `OpeningBalance` - Balance before the transaction
- `ClosingBalance` - Balance after the transaction
- `Description` - Transaction description
- `Date` - Transaction timestamp
The Docker containers are configured to access the external database using:
- extra_hosts mapping for DNS resolution
- Direct IP connection to 127.0.0.1:5432
- Network bridge allowing external connectivity
- Create an account using the API
- Check Kafka UI to verify the existence of the `banking-events` topic
- Click on the Messages tab of the topic to see the `AccountCreatedEvent` event
- Check consumer logs to see event processing
- Perform deposits/withdrawals/transfers and observe the event flow
- Query the Projections API to see updated balances and transactions
- Use Swagger UI to interact with the APIs
- Use Razor Pages for a simple web interface
- Check logs for detailed information
- Use Kafka UI to monitor topics and consumer groups
- Inspect databases using pgAdmin or any PostgreSQL client
- Run unit tests using `dotnet test` in the respective test project directories
- Run integration tests using `dotnet test` in the respective integration test project directories
- Event Sourcing Banking API: http://localhost:8082/health
- Projections Banking API: http://localhost:8084/health
- Kafka UI: http://localhost:8080
```
EDA-ES/
├── ESsample.Banking.API/                           # Banking API (Event Sourcing)
├── ESsample.Banking.Shared/                        # Shared contracts and integration events
├── EventBus/                                       # Event bus abstractions
├── EventBus.Kafka/                                 # Kafka event bus implementation
├── EventBus.UnitTests/                             # Unit tests for EventBus
├── EventSourcing/                                  # Core event sourcing abstractions
├── EventSourcing.Postgres/                         # PostgreSQL event store implementation
├── EventSourcing.Postgres.IntegrationTests/        # Integration tests for the PostgreSQL event store
├── EventSourcing.UnitTests/                        # Unit tests for EventSourcing
├── Projections.Banking/                            # Banking projections logic and use cases
├── Projections.Banking.API/                        # Banking projections API (reading models)
├── Projections.Banking.Consumer/                   # Integration Events consumer service (reading models)
├── Projections.Banking.Consumer.IntegrationTests/  # Integration tests for the consumer service
├── Projections.Banking.Postgres/                   # PostgreSQL implementation for reading models storage
├── Directory.Build.props                           # Shared build settings
├── docker-compose.yml                              # Docker services configuration file
├── EDA-ES.sln                                      # Visual Studio solution file
└── README.md                                       # This file
```
- Define the event in `ESsample.Banking.Shared/Events/`
- Publish the event in the Event Store Banking API
- Create a handler in `Projections.Banking.Consumer/Handlers/`
- Register the handler (automatic via reflection)
The consumer automatically discovers and registers event handlers using reflection. Simply implement `IIntegrationEventHandler<TEvent>` and the framework will handle the rest.
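A handler could look roughly like this. It assumes `IIntegrationEventHandler<TEvent>` exposes a single `HandleAsync` method and that the event carries the properties shown; the `IBalancesRepository` and `Balance` types are illustrative stand-ins, so check the actual abstractions in EventBus and Projections.Banking.Postgres:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical handler; the consumer discovers and registers it via reflection.
public class AccountCreatedEventHandler : IIntegrationEventHandler<AccountCreatedEvent>
{
    private readonly IBalancesRepository _balances; // illustrative dependency

    public AccountCreatedEventHandler(IBalancesRepository balances)
        => _balances = balances;

    public async Task HandleAsync(AccountCreatedEvent @event, CancellationToken ct = default)
    {
        // An upsert keeps the handler idempotent if Kafka redelivers the message.
        await _balances.UpsertAsync(new Balance
        {
            AccountId = @event.AccountId,
            AccountNumber = @event.AccountNumber,
            AccountName = @event.AccountName,
            OwnerName = @event.OwnerName,
            CurrentBalance = @event.InitialBalance,
            LastUpdated = DateTime.UtcNow
        }, ct);
    }
}
```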
Kafka Connection Issues:

- Ensure Kafka is healthy: `docker-compose ps`
- Check Kafka logs: `docker-compose logs kafka`
- Verify network connectivity between services

Database Connection Issues:

- Check PostgreSQL health: `docker-compose ps`
- Verify connection string configuration
- Check database logs: `docker-compose logs postgres`

Consumer Not Processing Events:

- Check consumer logs: `docker-compose logs banking-consumer`
- Verify Kafka topic exists in Kafka UI
- Check consumer group status in Kafka UI
```bash
# Restart all services
docker-compose restart

# Rebuild and restart specific service
docker-compose up -d --build banking-consumer

# View real-time logs
docker-compose logs -f banking-consumer banking-producer-api

# Clean up everything
docker-compose down -v
docker system prune -f
```

- Events are the source of truth
- Current state is derived from events
- Complete audit trail of all changes
- Temporal queries and time travel
- Separate models for reads and writes
- Commands modify state via events
- Queries read from projections/read models
- Loose coupling between services
- Asynchronous processing
- Scalable and resilient system design
- Integration via events
- Build read models from events
- Optimized for querying
- Materialized views of event data
- Event handlers update projections
- Supports multiple read models
- Event replay to rebuild projections
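Event replay is just a fold over the stream: start from an empty read model and apply each event in order. A sketch, assuming event shapes like those named earlier (the property names are illustrative assumptions):

```csharp
using System.Collections.Generic;
using System.Linq;

public static class ProjectionReplay
{
    // Rebuild a current-balance read model from the full event stream.
    public static decimal ReplayBalance(IEnumerable<object> stream) =>
        stream.Aggregate(0m, (balance, e) => e switch
        {
            AccountCreatedEvent created => created.InitialBalance,
            AccountBalanceUpdatedEvent updated => updated.NewBalance,
            _ => balance // unknown event types are ignored
        });
}
```

Because projections are derived entirely from events, they can be dropped and rebuilt at any time by re-running a fold like this over the stored stream.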
- Focus on the business domain
- Rich domain models
- Ubiquitous language
- Bounded contexts
- Aggregates and entities
- Value objects
- Domain events
- Repositories
- Organize code by feature
- Self-contained modules
- Reduced dependencies
- Easier to understand and maintain
- Focused testing
- Enhanced modularity
- Improved collaboration
This project is licensed under the MIT License - see the LICENSE file for details.