Proof of Concept demonstrating how to configure Terraform, create a remote backend, create your CI/CD infrastructure, and perform deployments from your SCM into AWS via CodePipeline & CodeCommit
These Terraform scripts will allow you to build the resources needed to store your Terraform State Files in a versioned, durable backend (AWS S3), as well as provide State Locking via a managed NoSQL database (AWS DynamoDB). This allows you to share your Terraform States and lock the state to prevent unapproved changes or overwrites to your codified infrastructure. The sub-repos will allow you to build out the backend infrastructure locally, then initialize the new backend and migrate your local *.tfstate into a path and file you specify. Following successful setup of your S3 Backend, you can then create and deploy the codified infrastructure needed for CI/CD via AWS CodePipeline, CodeBuild, and an SCM of your choosing (GitHub or CodeCommit).
The following steps mirror how you would install Terraform on Ubuntu 18.04 LTS on an EC2 Instance. Follow or skip these steps depending on where you have, or need to have, Terraform installed.
- Update Your Instance
sudo apt-get update && sudo apt-get upgrade -y
- Install Unzip
sudo apt-get install unzip
- Grab the Latest Version of Terraform (https://www.terraform.io/downloads.html)
wget https://releases.hashicorp.com/terraform/0.11.13/terraform_0.11.13_linux_amd64.zip
- Unzip Terraform Installation
unzip terraform_0.11.13_linux_amd64.zip
- Move to /usr/local/bin (or otherwise add it to your PATH)
sudo mv terraform /usr/local/bin/
- Ensure that Terraform is Installed Correctly
terraform --version
- Ensure your EC2 Instances have an Instance Profile that allows permissions for AT LEAST S3 and DynamoDB (https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_use_switch-role-ec2_instance-profiles.html)
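If you need to build that Instance Profile yourself, a minimal sketch in Terraform might look like the following. The role, profile names, and broad AWS-managed policies are placeholder assumptions, not part of the workshop; scope the permissions down for anything beyond a PoC.

```hcl
# Hypothetical sketch of an EC2 Instance Profile with S3 and DynamoDB access.
# Names and the AWS-managed FullAccess policies are placeholders.
resource "aws_iam_role" "terraform_ec2" {
  name = "terraform-ec2-role"

  assume_role_policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": { "Service": "ec2.amazonaws.com" },
    "Action": "sts:AssumeRole"
  }]
}
EOF
}

resource "aws_iam_role_policy_attachment" "s3" {
  role       = "${aws_iam_role.terraform_ec2.name}"
  policy_arn = "arn:aws:iam::aws:policy/AmazonS3FullAccess"
}

resource "aws_iam_role_policy_attachment" "dynamodb" {
  role       = "${aws_iam_role.terraform_ec2.name}"
  policy_arn = "arn:aws:iam::aws:policy/AmazonDynamoDBFullAccess"
}

resource "aws_iam_instance_profile" "terraform" {
  name = "terraform-ec2-profile"
  role = "${aws_iam_role.terraform_ec2.name}"
}
```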
- Clone This Repository
git clone https://github.com/jonrau1/AWS-CodePipeline-TerraformCICD-Workshop.git
- Navigate to the Remotes Directory
cd remotes
- Enter your Region into your Provider. Without an access key / secret key defined, Terraform will call the EC2 Metadata service for temporary credentials, provided an EC2 Instance Profile with the proper permissions is attached to your Instance (https://www.terraform.io/docs/providers/aws/index.html)
nano provider.tf
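A filled-out provider.tf can be as small as the sketch below; the region is a placeholder, and leaving out static keys is what causes Terraform to fall back to the Instance Profile credentials.

```hcl
# Hypothetical provider.tf -- region is a placeholder; no access_key/secret_key
# are set, so Terraform uses the EC2 Instance Profile for credentials.
provider "aws" {
  region = "us-east-1"
}
```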
- Enter a Unique DNS-Compliant Name for your S3 Bucket and a Unique Name for your DynamoDB Table
nano remotes.tf
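The two resources in remotes.tf might be sketched as follows. The bucket and table names are placeholders you must replace; note that Terraform's S3 backend requires the lock table's primary key to be a string attribute named LockID.

```hcl
# Hypothetical remotes.tf -- bucket and table names are placeholders.
resource "aws_s3_bucket" "terraform_state" {
  bucket = "my-unique-terraform-state-bucket"   # must be globally unique and DNS compliant

  versioning {
    enabled = true                              # keep prior versions of the state file
  }
}

resource "aws_dynamodb_table" "terraform_lock" {
  name           = "my-terraform-lock-table"
  read_capacity  = 5
  write_capacity = 5
  hash_key       = "LockID"                     # required key name for S3 backend locking

  attribute {
    name = "LockID"
    type = "S"
  }
}
```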
- Initialize Terraform & Download AWS Provider
terraform init
- Create a Terraform Execution Plan (https://www.terraform.io/docs/commands/plan.html)
terraform plan
- Apply the Terraform Execution Plan (https://www.terraform.io/docs/commands/apply.html)
terraform apply
- Retrieve the terraform.tf file from the S3 backend template and copy it to your remotes folder
- Fill out the terraform.tf file to include the name of your S3 Bucket, DynamoDB Table, and whatever folder path and naming convention you need your State File to follow
nano terraform.tf
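A filled-out terraform.tf might look like the sketch below; the bucket, table, and key path are placeholder values that must match the resources you just created, and the key controls the folder path and file name of your remote state.

```hcl
# Hypothetical terraform.tf -- substitute the bucket, table, and key path you chose.
terraform {
  backend "s3" {
    bucket         = "my-unique-terraform-state-bucket"   # S3 bucket from remotes.tf
    key            = "remotes/terraform.tfstate"          # folder path and state file name
    region         = "us-east-1"
    dynamodb_table = "my-terraform-lock-table"            # DynamoDB table for state locking
    encrypt        = true
  }
}
```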
- Reinitialize Terraform, this will copy your current State to your S3 Backend (https://www.terraform.io/docs/backends/types/s3.html)
terraform init
- Delete the local terraform.tfstate* files after confirming they are in your S3 Backend
rm terraform.tfstate terraform.tfstate.backup