Simple Jenkins + Packer + Terraform Pipeline for AWS EC2 using GitHub as SCM

Carlos Garcia
3 min read · Mar 30, 2020


Logos: Jenkins + Packer + Terraform

Most developers and Cloud engineers are very familiar with this particular stack. Nowadays, it seems to be the norm in a lot of companies.

For those of you who are new to this stack, it is mainly used for the CD portion of the CI/CD model. We can also use Jenkins for the CI portion, but that's a topic for another post.

For this post, I’m assuming that…

  • You have already created a private repo on GitHub
  • You have already created and configured a Jenkins instance running on AWS EC2
  • You have added SSH keys to both Jenkins and GitHub
  • You have created a new Pipeline job using GitHub as SCM

…all set? Cool, let's build the Jenkinsfile for our repo!

Initial Pipeline config…

pipeline {
  agent any
  environment {
    REGION      = 'us-east-1'   // alt: can be a param
    ENVIRONMENT = 'environment' // alt: can be a param
    APPLICATION = 'server'      // alt: can be a param
    PACKER_ACCESS_KEY = "\$(aws ssm get-parameters --region \"${REGION}\" --name \"/packer/akey\" --query 'Parameters[0].Value' | tr -d '\"')"
    PACKER_SECRET_KEY = "\$(aws ssm get-parameters --region \"${REGION}\" --name \"/packer/skey\" --query 'Parameters[0].Value' | tr -d '\"')"
    TERRAFORM_ROLE    = "\$(aws ssm get-parameters --region \"${REGION}\" --name \"/terraform/role\" --query 'Parameters[0].Value' | tr -d '\"')"
    TERRAFORM_BUCKET  = "\$(aws ssm get-parameters --region \"${REGION}\" --name \"/terraform/bucket\" --query 'Parameters[0].Value' | tr -d '\"')"
  }

NOTE: as you can see, we are pulling these values from SSM Parameter Store. Because the dollar sign is escaped (\$), Jenkins stores the literal $(aws ssm …) command substitution and the shell resolves it inside each sh step. Remember, this Jenkins instance should only be able to read/put these specific SSM parameters and read from S3. Also, make sure the parameters are encrypted.
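If the parameters don't exist yet, a minimal sketch of creating them with the AWS CLI could look like the following (the values are placeholders; the parameter names match the ones referenced in the environment block above). Keep in mind that if you store them as SecureString, the get-parameters calls above also need the --with-decryption flag:

# Placeholder values; replace with your own credentials, role ARN, and bucket name
aws ssm put-parameter --region us-east-1 --name /packer/akey      --type SecureString --value 'AKIA................'
aws ssm put-parameter --region us-east-1 --name /packer/skey      --type SecureString --value '<secret-access-key>'
aws ssm put-parameter --region us-east-1 --name /terraform/role   --type SecureString --value 'arn:aws:iam::123456789012:role/terraform-deploy'
aws ssm put-parameter --region us-east-1 --name /terraform/bucket --type SecureString --value '<terraform-state-bucket>'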

Build Stage

We're ready to start adding our stages! For the sake of simplicity, we're not going to use any logic or error handling in this pipeline. Please make sure you always include error handling and notification steps for when things go south (a minimal sketch of that follows below)!
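As a rough idea of what such a notification could look like (this block is not part of the pipeline below, and the mail settings are placeholders), a post section at the very end of the pipeline block can catch failures:

  post {
    failure {
      // Hypothetical notification step; swap in Slack, SNS, etc. as appropriate
      mail to: 'team@example.com',
           subject: "Pipeline failed: ${env.JOB_NAME} #${env.BUILD_NUMBER}",
           body: "Check the console output at ${env.BUILD_URL}"
    }
  }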

  stages {
    stage('Fetch') { // Clone the GitHub repo into the workspace
      steps {
        checkout scm
      }
    }
    stage('Build') { // Build an AMI from the Packer template
      steps {
        dir('packer') {
          sh "rm -f manifest.json"
          sh "packer.io validate -var \"region=${REGION}\" -var \"environment=${ENVIRONMENT}\" -var \"aws_access_key=${PACKER_ACCESS_KEY}\" -var \"aws_secret_key=${PACKER_SECRET_KEY}\" template.json"
          sh "packer.io build -var \"region=${REGION}\" -var \"environment=${ENVIRONMENT}\" -var \"aws_access_key=${PACKER_ACCESS_KEY}\" -var \"aws_secret_key=${PACKER_SECRET_KEY}\" template.json"
          sh "cat manifest.json | jq .builds[0].artifact_id | tr -d '\"' | cut -b 11- > .ami"
          script {
            AMI_ID = readFile('.ami').trim()
          }
        }
      }
    }

As you can see above, we split the build process into a "Fetch" stage and a "Build" stage: the first checks out our repo, and the second builds an AMI with Packer using the template from the cloned repo and stores the resulting AMI ID for the next stage.
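For reference, manifest.json is only written if template.json includes Packer's manifest post-processor, and its artifact_id field looks roughly like "us-east-1:ami-0123456789abcdef0" (region, a colon, then the AMI ID). That is what the jq/tr/cut one-liner above is unpacking; note that cut -b 11- assumes the 10-character "us-east-1:" prefix, so adjust it for other regions. A quick illustration in the shell:

# Hypothetical manifest.json content: { "builds": [ { "artifact_id": "us-east-1:ami-0123456789abcdef0", ... } ] }
jq -r '.builds[0].artifact_id' manifest.json | cut -b 11-
# -> ami-0123456789abcdef0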

Deploy Stage

Assuming the build process finished successfully, we can now deploy the AMI whose ID we stored in the previous stage, using Terraform.

    stage('Deploy') { // Deploy the newly created AMI
      steps {
        dir('terraform') {
          sh "terraform init -backend-config=\"role_arn=${TERRAFORM_ROLE}\" -backend-config=\"bucket=${TERRAFORM_BUCKET}\""
          sh "terraform plan -var \"region=${REGION}\" -var \"ami_id=${AMI_ID}\" -var \"role_arn=${TERRAFORM_ROLE}\" -var \"environment=${ENVIRONMENT}\""
          sh "terraform apply -var \"region=${REGION}\" -var \"ami_id=${AMI_ID}\" -var \"role_arn=${TERRAFORM_ROLE}\" -var \"environment=${ENVIRONMENT}\" --auto-approve"
        }
      }
    }
  }
}

Keep in mind that we pass the backend settings to terraform init as -backend-config flags so that sensitive values (the state bucket and the role ARN) never live in the repo, and so the same code can be deployed to other accounts just by changing the parameters. This means the backend block committed to the repo stays only partially configured, with the rest supplied at init time.
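One optional tightening of the Deploy stage, not part of the original pipeline: save the plan to a file and have apply consume that exact plan, so what gets applied is exactly what was reviewed. Inside dir('terraform'), the two sh steps could instead run something like:

terraform plan -out=tfplan -var "region=${REGION}" -var "ami_id=${AMI_ID}" -var "role_arn=${TERRAFORM_ROLE}" -var "environment=${ENVIRONMENT}"
terraform apply tfplan   # applying a saved plan needs no -var flags and no approval prompt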

In Conclusion…

The above is a very basic pipeline that leverages GitHub, Jenkins, Packer, and Terraform to deploy to AWS EC2. It barely scratches the surface of what Jenkins can do, and it can easily be adjusted and extended to deploy large-scale applications across a wide range of architectures.
