This is a demo project that explores an event sourcing architecture on AWS. The use case is a personal banking website that uses event sourcing with eventual consistency to process its data. In particular it has the following features:
- It uses Amazon Managed Streaming for Apache Kafka (MSK) to store events
- It uses a Java-based open source library for event sourcing from Simple Machines called Simple Sourcing
- It uses Amazon Elasticsearch Service to hold a read side projection (or materialized view) of our events
- It uses AWS Fargate to run the Java-based Docker containers for our read, write and projection services
- It uses the AWS CDK for both pipelines and infrastructure as code
The following tools need to be accessible in your development environment (laptop, container, Cloud9, etc.):
- Java 11. Why not try Amazon Corretto?
- Gradle. How to install Gradle.
- Node + NPM (required for CDK CLI installation). How to install Node + NPM.
- CDK CLI. Make sure you install the correct version with:

  ```
  npm i -g aws-cdk@1.31.0
  ```
- Git. Install git.
- AWS CLI (only required for configuring access keys). Install AWS CLI.
Verify you have the required tools by typing the following at a command line:

```
java --version     # should be 11+
gradle -version    # should be 6+
cdk --version      # should be 1.31.0
```
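If you want to check all of the prerequisites in one pass, a small shell loop like the following (a sketch, assuming a POSIX shell) will flag anything missing from your PATH:

```shell
# Report any prerequisite tool that is not on the PATH.
for tool in java gradle node npm cdk git aws; do
  command -v "$tool" >/dev/null 2>&1 || echo "MISSING: $tool"
done
```

This only checks that each tool is installed, not its version, so still run the version commands above.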
You will need access to an AWS account and a set of IAM access keys to drive programmatic access to the AWS environment from the command line. Specifically you need the access keys to allow the CDK CLI to provision resources using AWS Cloudformation.
You will need a GitHub account if you don't have one already.
- Clone a local copy of your forked repository to your development environment, e.g.:

  ```
  git clone https://github.com/jousby/msk-event-sourcing-demo.git
  ```
- Log into the AWS Console and switch over to your target region using the region selector in the top right hand corner.
- If you don't have a set of access keys for accessing this AWS environment from the command line, open up the IAM service and follow these instructions to create your access keys.
- At the moment (13/5/2020) there is one part of the demo infrastructure that can't be created via CDK/CloudFormation. In order to provision the Kafka cluster you will first need to manually create a Kafka Cluster Configuration object in the console. Navigate to the Amazon MSK Service homepage. Click on the navigation bar on the left of the screen if it is collapsed, then click on Cluster Configurations. Click on the button called Create cluster configuration. On the create cluster configuration screen enter:
  - Configuration Name: ESDemoConfig
  - Apache Kafka version: 2.3.1
  - Configuration details: (update the first three rows in the block to look like the following)

  ```
  auto.create.topics.enable=true
  default.replication.factor=3
  min.insync.replicas=1
  ```

  Hit the Create button to create the configuration. Back on the Cluster Configurations page click into the newly created cluster configuration. Under the configuration name there will be a Configuration ARN field. Copy this ARN value somewhere you can retrieve it later on (thanks for sticking with me :)
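If you prefer the command line over the console, the same configuration object can be sketched with the AWS CLI. This assumes your credentials are already configured (a later step), so the actual call is shown commented out; note that AWS CLI v2 requires the properties file to be passed as a binary blob via `fileb://`:

```shell
# Write the three overridden broker properties to a file.
cat > esdemo-config.properties <<'EOF'
auto.create.topics.enable=true
default.replication.factor=3
min.insync.replicas=1
EOF

# Create the MSK cluster configuration. The response JSON includes the
# Configuration ARN you need to record. Commented out here because it
# needs live AWS credentials:
# aws kafka create-configuration \
#   --name ESDemoConfig \
#   --kafka-versions 2.3.1 \
#   --server-properties fileb://esdemo-config.properties
```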
- Configure your command line to use the access keys created in the previous step by running `aws configure`. Make sure the region you enter during the configuration wizard is the same as the one you were using in the console.
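For reference, `aws configure` stores these values in two plain-text files in your home directory. A typical result looks like the following (the region shown is just an example, use whichever region you selected in the console):

```ini
# ~/.aws/credentials
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY

# ~/.aws/config
[default]
region = ap-southeast-2
output = json
```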
- If you have completed the AWS configuration step successfully then when you run the CDK CLI it should pick up your chosen region and access keys. Run `cdk bootstrap` to set up the CDK CLI for deploying stacks in your chosen region. This bootstrap process creates a CloudFormation stack and some S3 buckets.
- Change directory to the base of the project folder. Build the project with:

  ```
  gradle build
  ```

  Check for any errors in your Gradle build.
- Within the project, change directory to the `cdk-infra` folder.
- Update the `cdk.json` file with the Kafka Cluster Configuration ARN you recorded from step 4.
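The ARN typically goes into the `context` block of `cdk.json`. The fragment below is a hypothetical sketch — the context key name is defined by the repo's CDK code, so match whatever placeholder key is already present in the file rather than the name used here, and keep the other entries in the file intact:

```json
{
  "context": {
    "kafkaClusterConfigArn": "arn:aws:kafka:<region>:<account-id>:configuration/ESDemoConfig/<uuid>"
  }
}
```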
- Once step 9 has been completed run:

  ```
  cdk deploy
  ```

  This command will connect to AWS using the credentials (and implied account) you set up in step 5 and initiate the creation of a CloudFormation stack that will provision both the infrastructure and application code required for this demo. This will take a while to complete (up to ~60 mins). See the next step for how to check on the progress of the CloudFormation stack back in the AWS Console.
- Open up the AWS CloudFormation service homepage within the AWS Console. You should see a CloudFormation stack for the event sourcing demo in the process of being created.
- Once the stack has been created, click on the stack in the nav bar on the left and then click on the Outputs tab. Make a copy of both the readApiUrl and the writeApiUrl values for use in a later step.
- Change directory to the `frontend` folder in the base of the project.
- Update the `src/service/Endpoints.js` file with the read and write URL values you recorded from the stack outputs earlier.
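As a rough sketch, the edited file might look something like the following. The export shape and field names here are assumptions — keep whatever structure the repo's `Endpoints.js` already uses and only swap in your two URLs:

```javascript
// Hypothetical sketch of src/service/Endpoints.js.
// Replace the placeholder strings with the readApiUrl and writeApiUrl
// stack outputs you copied from the CloudFormation console.
const Endpoints = {
  readApiUrl: "https://REPLACE_WITH_readApiUrl_OUTPUT",
  writeApiUrl: "https://REPLACE_WITH_writeApiUrl_OUTPUT",
};

module.exports = Endpoints; // the repo may use `export default Endpoints;` instead
```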
- Build and run the frontend project with:

  ```
  npm install && npm start
  ```
- At this point you should have a local copy of the frontend running that is connected to your AWS infrastructure. Navigate to `localhost:3000` to take the demo application for a test drive.