How to use CodePipeline CICD pipeline to test Terraform

Ramy Nassef

Terraform allows you to manage your cloud resources by defining them using the "infrastructure as code" approach. As with programming languages, it is very important to test Terraform modules to make sure they contain no syntax errors or security issues, and that they comply with your organization's policies. In this article, we'll show you how to use AWS CodePipeline and AWS CodeBuild to build a CICD pipeline that runs tflint, Checkov, Open Policy Agent (OPA), Terratest, Terrascan, and Infracost to test your Terraform code.

Prerequisites

In this article we’re using the following technology stack:

  • AWS CodeBuild
  • AWS CodeCommit
  • AWS CodePipeline
  • Terraform v1.0.6
  • TFLint v0.33.0
  • Checkov (latest)
  • Terrascan v1.9.0
  • Terratest v0.38.2

In addition to that, you can find the source code for all demo projects in our GitHub repositories.

Terraform Testing

On most projects, cloud engineers or developers start with the following tests:

  • Static code analysis to detect programmatic or syntax errors
  • Compliance checks to ensure the configuration follows the policies you’ve defined for the project
  • Integration testing to verify that a feature works well with the application as a whole
  • Functional testing to validate that the Terraform module can be deployed without issues using specified sets of parameters

Static Code Analysis

Often you can find minor issues that you may have missed when eyeballing your code. Static code analysis is a technique for detecting defects in code without actually executing it. The term commonly refers to running static analysis tools that highlight possible vulnerabilities while parsing the source code and checking it using different techniques.

The benefits of static code analysis are:

  • Fast
  • Stable
  • No need to deploy resources
  • Very easy to use

There are several ways to do a static code analysis for Terraform. We’ll cover them one by one below.

Terraform validate

Terraform has a built-in feature that allows you to validate your code by using the terraform validate command. This command allows you to check that you have not made trivial syntax mistakes, missed a bracket, or left any unused variables.

It’s highly recommended to run the terraform fmt command as well, to format Terraform files to a canonical format and style.

All you need to do here is run terraform init, then the terraform validate command, in your code directory.
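terraform validate also supports machine-readable output via the -json flag, which is handy when scripting CICD steps. Here's a minimal Python sketch that summarizes such a report; the JSON below is an abridged, hypothetical sample of the command's output shape (valid, error_count, diagnostics):

```python
import json

# Abridged sample of `terraform validate -json` output; in a pipeline
# you would read this from the command's stdout instead.
raw = '''
{
  "format_version": "1.0",
  "valid": false,
  "error_count": 1,
  "warning_count": 0,
  "diagnostics": [
    {
      "severity": "error",
      "summary": "Reference to undeclared input variable",
      "detail": "An input variable with the name \\"env\\" has not been declared."
    }
  ]
}
'''

def summarize_validation(report_json: str) -> str:
    """Return a one-line summary of a terraform validate JSON report."""
    report = json.loads(report_json)
    if report["valid"]:
        return "terraform validate: OK ({} warnings)".format(report["warning_count"])
    issues = "; ".join(d["summary"] for d in report["diagnostics"])
    return "terraform validate: {} error(s): {}".format(report["error_count"], issues)

print(summarize_validation(raw))
# terraform validate: 1 error(s): Reference to undeclared input variable
```

A CI step could exit non-zero whenever the summary reports errors, failing the pipeline early.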


TFLint

TFLint is a Terraform linting framework with lots of interesting features. The key features are as follows:

  • Find possible errors (like illegal instance types, for example) for major cloud providers (AWS, Azure, GCP, etc.)
  • Warn about deprecated syntax and unused declarations
  • Enforce best practices and naming conventions

Here’s an example of running tflint --init and tflint in the code directory.

The error below is caused by a Terraform expression that has been deprecated since Terraform v0.12.

(Screenshot: example of a failed tflint test)

If the check is executed in the CICD pipeline, you’ll need to fix that issue to have a successful pipeline step execution:

(Screenshot: example of a successful tflint test)

Here’s another example that shows failed tflint output which can’t be detected by the terraform validate command.

Here we intentionally requested the wrong type of AWS EC2 instance. The terraform validate check will not be able to detect it, while TFLint will.
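For reference, the intentional mistake looked similar to the hypothetical snippet below: the instance type contains a typo, which tflint's AWS ruleset flags while terraform validate accepts it as syntactically valid (the AMI ID here is a placeholder):

```hcl
resource "aws_instance" "example" {
  ami           = "ami-0c55b159cbfafe1f0" # placeholder AMI ID
  instance_type = "t2.mycro"              # invalid type: tflint flags it, terraform validate does not
}
```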

(Screenshot: tflint output)

Let’s put both the terraform validate and tflint commands in a buildspec-tflint.yml file to allow AWS CodeBuild to check code quality. We’ll use it as a stage in the AWS CodePipeline later.

version: 0.2

env:
  variables:
    TF_VERSION: "1.0.6"

phases:

  install:
    commands:
      - cd /usr/bin
      - "curl -s -qL -o terraform.zip https://releases.hashicorp.com/terraform/${TF_VERSION}/terraform_${TF_VERSION}_linux_amd64.zip"
      - unzip -o terraform.zip
      - "curl --location https://github.com/terraform-linters/tflint/releases/download/v0.33.0/tflint_linux_amd64.zip --output tflint_linux_amd64.zip"
      - unzip -o tflint_linux_amd64.zip

  build:
    commands:
      - cd "$CODEBUILD_SRC_DIR"
      - terraform init
      - terraform validate
      - tflint --init
      - tflint

  post_build:
    commands:
      - echo "terraform validate completed on `date`"
      - echo "tflint completed on `date`"

Checkov

Checkov is a static code analysis tool for scanning Terraform infrastructure as code (IaC) files for misconfigurations that may lead to security or compliance problems. It includes more than 750 predefined policies to check for common misconfiguration issues. Checkov supports the creation and contribution of custom policies and can output results in different formats, including JSON and JUnit XML. It can handle variables effectively by building a graph of dynamic code dependencies.

Here’s an example of the Checkov execution for a simple S3 bucket TF project, that shows a detailed output for passed, failed, and skipped checks:

(Screenshots: Checkov execution output showing passed, failed, and skipped checks)

Let’s fix Checkov issues for the S3 bucket:

resource "aws_s3_bucket" "test" {
  bucket = var.name
  acl    = "private"

  versioning {
    enabled = true
  }

  server_side_encryption_configuration {
    rule {
      apply_server_side_encryption_by_default {
        kms_master_key_id = aws_kms_key.this.arn
        sse_algorithm     = "aws:kms"
      }
    }
  }

  lifecycle {
    prevent_destroy = false
  }

  tags = local.common_tags
}

After enabling versioning and encrypting the S3 bucket, we have successful Checkov execution results:

(Screenshot: successful Checkov execution)

Checkov requirements are:

  • Python >= 3.7 (Data classes are available for Python 3.7+)
  • Terraform >= 0.12

Including Checkov in the CICD pipeline is a quick and straightforward process.

First, you have to install Checkov using pip:

pip3 install checkov

After that, we can run Checkov against the Terraform code directory:

checkov --directory ./ --skip-check CKV_AWS_18,CKV_AWS_144

Note that you can skip specific Checkov checks by adding the --skip-check CKV_AWS_18,CKV_AWS_144 argument.
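Checkov can also emit JSON (checkov --directory ./ -o json), which is useful when the pipeline needs to post-process results, for example to fail only on certain checks. Here's a minimal Python sketch; the embedded JSON is an abridged, hypothetical sample of the output shape (results with passed_checks/failed_checks lists):

```python
import json

# Abridged, hypothetical sample of `checkov -o json` output
raw = '''
{
  "check_type": "terraform",
  "results": {
    "passed_checks": [
      {"check_id": "CKV_AWS_19", "resource": "aws_s3_bucket.test"}
    ],
    "failed_checks": [
      {"check_id": "CKV_AWS_21", "resource": "aws_s3_bucket.test"}
    ],
    "skipped_checks": []
  },
  "summary": {"passed": 1, "failed": 1, "skipped": 0}
}
'''

def failed_check_ids(report_json: str) -> list:
    """Return the IDs of failed checks so the pipeline can decide what is blocking."""
    report = json.loads(report_json)
    return [c["check_id"] for c in report["results"]["failed_checks"]]

print(failed_check_ids(raw))  # ['CKV_AWS_21']
```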

Here’s an example of the buildspec-checkov.yml file for installing and running Checkov in AWS CodeBuild.

version: 0.2

env:
  variables:
    TF_VERSION: "1.0.6"

phases:

  install:
    runtime-versions:
       python: latest
    commands:
      - cd /usr/bin
      - "curl -s -qL -o terraform.zip https://releases.hashicorp.com/terraform/${TF_VERSION}/terraform_${TF_VERSION}_linux_amd64.zip"
      - unzip -o terraform.zip
      - python -V
      - pip3 install checkov

  build:
    commands:   
      - cd "$CODEBUILD_SRC_DIR"
      - checkov --directory ./ --skip-check CKV_AWS_18,CKV_AWS_144

  post_build:
    commands:
      - echo "Checkov test is completed on `date`"

Terrascan

Terrascan is a static code analyzer for infrastructure as code. Terrascan allows you to:

  • Seamlessly scan infrastructure as code for misconfigurations
  • Monitor provisioned cloud infrastructure for configuration changes that introduce posture drift, and revert to a secure posture
  • Detect security vulnerabilities and compliance violations
  • Mitigate risks before provisioning cloud-native infrastructure
  • Run locally or integrate with your CI/CD

At the time of writing this article, Terrascan v1.11.0 has a bug that prevents its use in a CICD pipeline: the tool returns a non-zero exit code after execution, which fails the pipeline step. So we’ll rely on Terrascan v1.9.0, which works fine.

To scan your Terraform project using Terrascan, you need to execute the following commands:

terrascan init
terrascan scan -i terraform
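Terrascan can also produce JSON output (terrascan scan -i terraform -o json) if you want to post-process violations in the pipeline. Here's a minimal Python sketch; the embedded JSON is an abridged, hypothetical sample of the output shape (results with a violations list):

```python
import json

# Abridged, hypothetical sample of `terrascan scan -o json` output
raw = '''
{
  "results": {
    "violations": [
      {
        "rule_name": "s3EnforceUserACL",
        "rule_id": "AWS.S3Bucket.IAM.High.0370",
        "severity": "HIGH",
        "resource_name": "test",
        "resource_type": "aws_s3_bucket"
      }
    ],
    "count": {"low": 0, "medium": 0, "high": 1, "total": 1}
  }
}
'''

def high_severity_violations(report_json: str) -> list:
    """Return rule IDs of HIGH severity violations, e.g. to fail the build only on those."""
    report = json.loads(report_json)
    return [v["rule_id"] for v in report["results"]["violations"]
            if v["severity"] == "HIGH"]

print(high_severity_violations(raw))
```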

Here’s the buildspec-terrascan.yml file for AWS CodeBuild:

version: 0.2

env:
  variables:
    TF_VERSION: "1.0.6"
    TERRASCAN_VERSION: "1.9.0"

phases:

  install:
    runtime-versions:
      python: latest
    commands:
      - cd /usr/bin
      - "curl -s -qL -o terraform.zip https://releases.hashicorp.com/terraform/${TF_VERSION}/terraform_${TF_VERSION}_linux_amd64.zip"
      - unzip -o terraform.zip
      - "curl -L -o terrascan_${TERRASCAN_VERSION}_Linux_x86_64.tar.gz https://github.com/accurics/terrascan/releases/download/v${TERRASCAN_VERSION}/terrascan_${TERRASCAN_VERSION}_Linux_x86_64.tar.gz"
      - "tar -xf terrascan_${TERRASCAN_VERSION}_Linux_x86_64.tar.gz terrascan"

  build:
    commands:
      - cd "$CODEBUILD_SRC_DIR"
      - terrascan init
      - terrascan scan -i terraform

  post_build:
    commands:
      - echo "Terrascan test is completed on `date`"

Here’s how to skip Terrascan rules: use the syntax #ts:skip=RuleID optional_comment inside a resource to skip the rule for that resource.

You can find the RuleID values in the JSON files under the pkg/policies/opa/rego/aws subfolders of the Terrascan repository.

An example of skipping S3 bucket logging requirements rule in our demo project Terraform code:

resource "aws_s3_bucket" "test" {
  #ts:skip=AC_AWS_0497 We don't need logging for this S3 bucket
  bucket = var.name
  acl    = "private"

  versioning {
    enabled = true
  }

  server_side_encryption_configuration {
    rule {
      apply_server_side_encryption_by_default {
        kms_master_key_id = aws_kms_key.this.arn
        sse_algorithm     = "aws:kms"
      }
    }
  }

  lifecycle {
    prevent_destroy = false
  }

  tags = local.common_tags
}

(Screenshot: Terrascan execution output)

Compliance Testing

Compliance testing is a regular part of any Terraform continuous integration process. It ensures that your configuration follows the user-defined policies that encode your organization’s compliance rules. For example, you can define specific naming conventions for your AWS resources. Another common example is allowing virtual machines to be created only from a defined subset of images. Compliance testing is used to enforce your corporate compliance rules for all developers in your organization.

Open Policy Agent (OPA)

Open Policy Agent (OPA) is an open-source Policy as Code testing tool that helps organizations enforce corporate standards for Terraform projects and speed up the code review process through automated compliance checks.

All OPA policies are defined in the Rego language, which allows policy authors to focus on the checks rather than on the scanning execution process.

Here’s a list of OPA benefits:

  • Policy as Code allows you to follow a standard SDLC for policy management and keep the history of policy changes in a code repository
  • OPA is designed to work with any kind of JSON input, so you can easily integrate it with any tool that produces JSON output
  • OPA integrates with lots of different tools and lets you use it as a standard policy language across different projects

Here’s what the OPA scan process looks like for a Terraform project:

(Diagram: the OPA scan process for a Terraform project)

To run OPA checks against a Terraform project, we need to create and save the Terraform plan:

terraform plan -out=FILENAME

Next, we need to convert the Terraform execution plan into a JSON file so that it can be read by OPA.

terraform show -json FILENAME > FILENAME.json

Once we’ve converted the Terraform plan to JSON, we need to write an OPA policy to check the plan before running the terraform apply command.

In the following example, we will check that only approved AWS resources (AWS Security Groups and S3 buckets) can be created.

Save the following OPA policy in the file named terraform.rego:

package terraform
 
import input as tfplan
 
# Allowed Terraform resources
allowed_resources = [
	"aws_security_group",
	"aws_s3_bucket"
]
 
 
array_contains(arr, elem) {
	arr[_] = elem
}
 
deny[reason] {
  resource := tfplan.resource_changes[_]
  action := resource.change.actions[count(resource.change.actions) - 1]
  array_contains(["create", "update"], action)  # allow destroy action

  not array_contains(allowed_resources, resource.type)

  reason := sprintf(
    "%s: resource type %q is not allowed",
    [resource.address, resource.type]
  )
}
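If you want to prototype a policy like this without installing OPA, the same check is easy to mirror in plain Python against the plan JSON. This is a sketch only; the Rego policy above remains the source of truth in the pipeline, and the plan JSON below is an abridged sample of the resource_changes section produced by terraform show -json:

```python
import json

ALLOWED_RESOURCES = {"aws_security_group", "aws_s3_bucket"}

# Abridged resource_changes section of a `terraform show -json tf.plan` output
plan = json.loads('''
{
  "resource_changes": [
    {"address": "aws_s3_bucket.test", "type": "aws_s3_bucket",
     "change": {"actions": ["create"]}},
    {"address": "aws_iam_user.test", "type": "aws_iam_user",
     "change": {"actions": ["create"]}}
  ]
}
''')

def deny_reasons(plan: dict) -> list:
    """Mirror of the Rego deny rule: flag create/update of non-allowed resource types."""
    reasons = []
    for rc in plan["resource_changes"]:
        action = rc["change"]["actions"][-1]  # last action, as in the Rego policy
        if action in ("create", "update") and rc["type"] not in ALLOWED_RESOURCES:
            reasons.append('%s: resource type "%s" is not allowed'
                           % (rc["address"], rc["type"]))
    return reasons

print(deny_reasons(plan))
# ['aws_iam_user.test: resource type "aws_iam_user" is not allowed']
```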

You can find lots of interesting OPA checks examples in the Scalr/sample-tf-opa-policies GitHub repository.

To check the Terraform plan for any compliance issues, run the following command:

opa eval --format pretty --data FILENAME.rego --input FILENAME.json "data.terraform"

Finally, we can put everything in the buildspec-opa.yml file for AWS CodeBuild:

version: 0.2

env:
  variables:
    TF_VERSION: "1.0.6"

phases:

  install:
    commands:
      - cd /usr/bin
      - "curl -s -qL -o terraform.zip https://releases.hashicorp.com/terraform/${TF_VERSION}/terraform_${TF_VERSION}_linux_amd64.zip"
      - unzip -o terraform.zip
      - curl -L -o opa https://openpolicyagent.org/downloads/v0.32.0/opa_linux_amd64_static
      - chmod 755 ./opa

  build:
    commands:   
      - cd "$CODEBUILD_SRC_DIR"
      - terraform init -no-color
      - terraform plan -out tf.plan
      - terraform show -json tf.plan > tf.json 
      - opa eval --format pretty --data ./test/opa/terraform.rego --input tf.json "data.terraform"

  post_build:
    commands:
      - echo "OPA Test completed on `date`"

Here’s an execution output example:

(Screenshot: OPA test execution output)

Functional Testing

Functional testing is a type of software testing that validates the software system against the functional requirements or specifications. The purpose of functional tests is to test the code by providing appropriate input and verifying the output against the functional requirements.

One of the common ways of doing functional testing for Terraform projects is using Terratest.

Terratest

Terratest is a Go library that provides patterns and helper functions for testing Terraform code. Basically, Terratest is a wrapper around Terraform that allows you to implement functional testing of your infrastructure using Go as a programming language.

We highly encourage you to check the official Terratest Quick Start Guide before moving forward with this section.

Also, you can find lots of examples of using Terratest in the official Git repository.

To define a Terratest test, you need to create a new sub-directory called “test” in your repository, then create a file that ends with _test.go. This file will contain the test cases. In our example, we created a test that does the following:

  • Deploys an S3 bucket with a unique name suffix, tagged “Automated Testing”, in the AWS region us-east-2
  • Passes input variables (such as the bucket name and environment tag) at runtime
  • Runs terraform init, then terraform apply
  • Verifies that our S3 bucket has versioning enabled and a policy attached
  • Locates the original Terraform files via TerraformDir
  • Runs terraform output to display the S3 bucket name, so you can quickly find it in the AWS Console before it is destroyed by the terraform destroy command

package test

import (
	"fmt"
	"strings"
	"testing"

	"github.com/gruntwork-io/terratest/modules/aws"
	"github.com/gruntwork-io/terratest/modules/random"
	"github.com/gruntwork-io/terratest/modules/terraform"
	"github.com/stretchr/testify/assert"
)

// An example of how to test the Terraform code using Terratest.
func TestTerraformAwsS3Example(t *testing.T) {
	t.Parallel()

	// Give this S3 Bucket a unique ID for a name tag so we can distinguish it from any other Buckets provisioned
	// in your AWS account
	expectedName := fmt.Sprintf("terratest-s3-hands-on-cloud-%s", strings.ToLower(random.UniqueId()))

	// Give this S3 Bucket an environment to operate as a part of for the purposes of resource tagging
	expectedEnvironment := "Automated Testing"

	// Pick a random AWS region to test in. This helps ensure your code works in all regions.
	awsRegion := "us-east-2"

	terraformOptions := &terraform.Options{
		// The path to where our Terraform code is located
		TerraformDir: "../../",

		// Variables to pass to our Terraform code using -var options
		Vars: map[string]interface{}{
			"name":        expectedName,
			"env":         expectedEnvironment,
		},

		// Environment variables to set when running Terraform
		EnvVars: map[string]string{
			"AWS_DEFAULT_REGION": awsRegion,
		},
	}

	// At the end of the test, run `terraform destroy` to clean up any resources that were created
	defer terraform.Destroy(t, terraformOptions)

	// This will run `terraform init` and `terraform apply` and fail the test if there are any errors
	terraform.InitAndApply(t, terraformOptions)

	// Run `terraform output` to get the value of an output variable
	bucketID := terraform.Output(t, terraformOptions, "bucket_id")

	// Verify that our Bucket has versioning enabled
	actualStatus := aws.GetS3BucketVersioning(t, awsRegion, bucketID)
	expectedStatus := "Enabled"
	assert.Equal(t, expectedStatus, actualStatus)

	// Verify that our Bucket has a policy attached
	aws.AssertS3BucketPolicyExists(t, awsRegion, bucketID)
}

Now, we can create the buildspec-terratest.yaml file:

version: 0.2

env:
  variables:
    TF_VERSION: "1.0.6"

phases:

  install:
    commands:
      - cd /usr/bin
      - "curl -s -qL -o terraform.zip https://releases.hashicorp.com/terraform/${TF_VERSION}/terraform_${TF_VERSION}_linux_amd64.zip"
      - unzip -o terraform.zip

  build:
    commands:   
      - cd "$CODEBUILD_SRC_DIR"
      - cd test/terratest
      - go mod init "tftest"
      - go get github.com/gruntwork-io/terratest/modules/aws
      - go get github.com/gruntwork-io/terratest/modules/terraform@v0.38.2
      - go test -v

  post_build:
    commands:
      - echo "terratest completed on `date`"

Here’s an example of Terratest execution output:

(Screenshots: Terratest execution output — terraform init, terraform destroy, and passing tests)

Tracking infrastructure costs (Infracost)

As a bonus, we’ll add Infracost (GitHub repo) as a separate step to our CICD pipeline to allow you to estimate deployment infrastructure costs.

Infracost shows cloud cost estimates for infrastructure-as-code projects such as Terraform. It helps DevOps, SRE, and developers to quickly see a cost breakdown and compare different options up front.

To start using Infracost in your CICD pipeline, you have to install it locally and then get a free API token by running:

infracost register

Once you’ve obtained the key, you need to save it in AWS Systems Manager Parameter Store using the common prefix for SSM keys used by our TF demo project.

(Screenshot: the Infracost API key in SSM Parameter Store)

To add Infracost to AWS CodePipeline as AWS CodeBuild step, we need to define the buildspec-infracost.yml file:

version: 0.2

env:
  variables:
    TF_VERSION: "1.0.6"
    INFRACOST_API_KEY_SSM_PARAM_NAME: "/org/hands-on-cloud/terraform/infracost_api_key"

phases:

  install:
    commands:
      - cd /usr/bin
      - "curl -s -qL -o terraform.zip https://releases.hashicorp.com/terraform/${TF_VERSION}/terraform_${TF_VERSION}_linux_amd64.zip"
      - unzip -o terraform.zip
      - apt-get update
      - apt-get -y install sudo
      - "curl -fsSL https://raw.githubusercontent.com/infracost/infracost/master/scripts/install.sh | bash"

  build:
    commands:
      - cd "$CODEBUILD_SRC_DIR"
      - 'export INFRACOST_API_KEY=$(aws --region=us-west-2 ssm get-parameter --name "${INFRACOST_API_KEY_SSM_PARAM_NAME}" --with-decryption --output text --query Parameter.Value)'
      - infracost breakdown --path .

  post_build:
    commands:
      - echo "Costs breakdown completed on `date`"

We’re using the INFRACOST_API_KEY environment variable to set up the Infracost API key from SSM Parameter Store.
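If you want the pipeline to act on the numbers, for example failing when estimated costs exceed a budget, infracost breakdown also supports --format json. Here's a minimal Python sketch; the embedded JSON is an abridged, hypothetical sample of the output shape (a top-level totalMonthlyCost string):

```python
import json

# Abridged, hypothetical sample of `infracost breakdown --format json` output
raw = '''
{
  "currency": "USD",
  "projects": [
    {"name": "terraform-demo", "breakdown": {"totalMonthlyCost": "73.00"}}
  ],
  "totalMonthlyCost": "73.00"
}
'''

def check_budget(report_json: str, monthly_budget: float) -> bool:
    """Return True if the estimated total monthly cost stays within budget."""
    report = json.loads(report_json)
    return float(report["totalMonthlyCost"]) <= monthly_budget

print(check_budget(raw, 100.0))  # True: $73.00 <= $100.00
```

A build step could run this against the real Infracost output and exit non-zero when the budget check fails.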

For more information about AWS Systems Manager Parameter Store check out our article Introduction to AWS Systems Manager.

Here’s an example of Infracost execution output:

(Screenshot: Infracost execution results)

CICD pipeline example

AWS CodePipeline is a fully managed continuous delivery service that helps automate your release pipelines for fast and reliable application and infrastructure updates. In our example we’re using this service to create an end-to-end automation pipeline that fetches the application source code automatically from the AWS CodeCommit repository, runs tests, and deploys Terraform module using CodeBuild.

In this article, we’d like to combine the Terraform tests mentioned above into a single AWS CodePipeline. We will create the following resources to achieve our goal:

  • CodeCommit: A fully-managed source control service that hosts secure Git-based repositories.
  • DynamoDB: Terraform will lock your state for all operations that could write state and will keep a record in DynamoDB.
  • IAM Roles: to customize fine-grained access controls to the source.
  • S3 Buckets: This solution uses an S3 bucket to store the Terraform build artifacts and state files created during the pipeline run.

(Diagram: CICD pipeline architecture)

Note: we strongly recommend extending AWS CodeBuild with your own custom build Docker image to avoid CICD pipeline flakiness and errors like:

Error: Failed to fetch GitHub releases:
GET https://api.github.com/repos/terraform-linters/tflint-ruleset-aws/releases/tags/v0.5.0:
403 API rate limit exceeded for 52.43.76.91. (But here's the good news:
Authenticated requests get a higher rate limit. Check out the documentation for more details.)

Configuring AWS CodePipeline using Terraform

First of all, you need to clone the following Terraform demo project and push it to a CodeCommit repository in your AWS account:

This is a regular Terraform module/project, which contains several buildspec*.yml files, each handling its own CodeBuild step in the CICD pipeline:

(Screenshot: buildspec files in the demo Terraform project)

To set up AWS CodePipeline for testing the demo Terraform project using tflint, Checkov, Open Policy Agent (OPA), Terratest, Terrascan, and Infracost, you need to clone our demo repository:

This repository consists of two Terraform modules:

  • 0_remote_state – this module sets up the Terraform backend infrastructure (an S3 bucket for storing state files and a DynamoDB table for handling Terraform execution locks) and exposes the created resources through SSM Parameter Store for reference from the second module
  • 1_pipeline – this module deploys AWS CodePipeline (change the variables.tf file to point the pipeline to your CodeCommit repository and branch)

Follow the instructions in the README.md file in each module to deploy it.

If everything has been set up correctly, you’ll be able to test and validate your Terraform module code using a fully automated CICD pipeline.

(Screenshot: Terraform CodePipeline execution results)

Summary

In this article, we’ve covered how to automate static code analysis (terraform validate, tflint, Checkov, Terrascan), compliance checks (OPA), functional tests (Terratest), and infrastructure cost checks (Infracost) using the AWS CodeCommit, AWS CodePipeline, and AWS CodeBuild services.
