This blog is part of 12 days of CI/CD, a celebration of all things CI/CD. Click here for more content and stay up to date by following us on Twitter!
This post was written by Oleksandr Kyrdan, a Developer at SoftServe.
Infrastructure as Code (IaC) gives teams the ability to manage their infrastructure using configuration or definition files. It works much like source code: it can be versioned, and executing it produces the same infrastructure every time. IaC helps solve problems teams face in their development workflow by reducing cost, improving speed, and mitigating risk when testing and deploying code.
We decided to use Infrastructure as Code when building the AWS Elastic Beanstalk pipe. Using IaC would allow us to automate provisioning and keep track of infrastructure for development and testing in a consistent and reliable way.
Using Infrastructure as Code for testing use cases
For the AWS Elastic Beanstalk pipe we automate the infrastructure setup on every test run, using the AWS CloudFormation pipe to deploy all infrastructure changes before the tests execute.
Each infrastructure definition (or template) lives in the same Bitbucket repository as the source code, giving the team full version control over the infrastructure. This gives us a complete history of changes and, because the infrastructure definition lives alongside our code, lets us re-run any past build and be confident it still works.
Here's how we did it:
Step 1: Set up the AWS account to use AWS CloudFormation
To automate our test infrastructure we first had to configure an IAM user with sufficient permissions so that the AWS CloudFormation pipe could provision the infrastructure required for the tests.
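The exact permissions depend on the resources your template creates, so treat the following as a rough sketch rather than the policy we actually use. One way to model it, staying in CloudFormation's YAML syntax, is a managed policy that covers CloudFormation, Elastic Beanstalk, and the services Elastic Beanstalk drives under the hood. The user's access key and secret are then stored as secured repository variables (AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY) for Pipelines to use.
# Sketch only: an illustrative managed policy for the pipe's IAM user.
# Tighten or extend the action list to match what your own template creates.
PipeUserPolicy:
  Type: AWS::IAM::ManagedPolicy
  Properties:
    Description: Permissions for the IAM user driving the AWS CloudFormation pipe
    PolicyDocument:
      Version: '2012-10-17'
      Statement:
        - Effect: Allow
          Action:
            - cloudformation:*
            - elasticbeanstalk:*
            - autoscaling:*
            - elasticloadbalancing:*
            - ec2:*
            - s3:*
            - cloudwatch:*
            - iam:PassRole
          Resource: '*'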
Step 2: Define and build the infrastructure definition template
Next we modeled and built our infrastructure definition using CloudFormation's YAML syntax. In the template we declare the AWS resources needed to create and configure the environment our integration tests run against. For more information, check out the AWS documentation on template anatomy.
AWS already provides template snippets for most of its services, so we took the CloudFormation template for Elastic Beanstalk (YAML or JSON) and modified it for our specific needs.
AWSTemplateFormatVersion: '2010-09-09'
Parameters:
  ApplicationName:
    Type: String
  EnvironmentName:
    Type: String
Resources:
  sampleApplication:
    Type: AWS::ElasticBeanstalk::Application
    Properties:
      ApplicationName: !Ref ApplicationName
      Description: AWS Elastic Beanstalk Sample Application
  sampleApplicationVersion:
    Type: AWS::ElasticBeanstalk::ApplicationVersion
    Properties:
      ApplicationName:
        Ref: sampleApplication
      Description: AWS ElasticBeanstalk Sample Application Version
      SourceBundle:
        S3Bucket: bbci-pipes-test-infrastructure-us-east-1
        S3Key: ebs-nodejs-newsample-app.zip
  sampleConfigurationTemplate:
    Type: AWS::ElasticBeanstalk::ConfigurationTemplate
    Properties:
      ApplicationName:
        Ref: sampleApplication
      Description: AWS ElasticBeanstalk Sample Configuration Template
      OptionSettings:
        - Namespace: aws:autoscaling:asg
          OptionName: MinSize
          Value: '1'
        - Namespace: aws:autoscaling:asg
          OptionName: MaxSize
          Value: '2'
        - Namespace: aws:elasticbeanstalk:environment
          OptionName: EnvironmentType
          Value: LoadBalanced
      SolutionStackName: 64bit Amazon Linux 2018.03 v4.10.2 running Node.js
  sampleEnvironment:
    Type: AWS::ElasticBeanstalk::Environment
    Properties:
      EnvironmentName: !Ref EnvironmentName
      ApplicationName:
        Ref: sampleApplication
      Description: AWS ElasticBeanstalk Sample Environment
      TemplateName:
        Ref: sampleConfigurationTemplate
      VersionLabel:
        Ref: sampleApplicationVersion
You can also find the modified template in the Elastic Beanstalk pipe repository here.
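One optional addition (it isn't part of the template above): if your tests need to reach the environment once it's up, the template can hand back the Elastic Beanstalk endpoint through an Outputs section, along these lines:
# Optional sketch: expose the environment's endpoint for the tests to hit.
Outputs:
  EnvironmentURL:
    Description: Endpoint URL of the sample environment
    Value: !GetAtt sampleEnvironment.EndpointURL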
Step 3: Configure Pipelines to set up infrastructure and run tests on every build
For every build we wanted Pipelines to set up our infrastructure and run the tests. To do this we added a step to our pipeline called "Setup testing resources" that uses the aws-cloudformation-deploy pipe.
setup: &setup
  step:
    name: Setup testing resources
    script:
      - pipe: atlassian/aws-cloudformation-deploy:0.5.0
        variables:
          AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
          AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
          AWS_DEFAULT_REGION: "us-east-1"
          STACK_NAME: "bbci-pipes-test-infrastructure-ebs-nodejs-${BITBUCKET_BUILD_NUMBER}"
          TEMPLATE: "./test/CloudFormationStackTemplate_nodejs.yml"
          CAPABILITIES: ['CAPABILITY_IAM']
          WAIT: 'true'
          STACK_PARAMETERS: >
            [{
              "ParameterKey": "ApplicationName",
              "ParameterValue": "bbci-pipes-test-infrastructure-${BITBUCKET_BUILD_NUMBER}"
            },
            {
              "ParameterKey": "EnvironmentName",
              "ParameterValue": "master-${BITBUCKET_BUILD_NUMBER}"
            }]

test: &test
  step:
    name: Test
    image: python:3.7
    script:
      - apt-get update && apt-get install zip openjdk-11-jdk -y
      - pip install -r test/requirements.txt
      - pytest --verbose test/test.py --junitxml=test-reports/report.xml
    after-script:
      # delete the stack
      - STACK_NAME="bbci-pipes-test-infrastructure-ebs-nodejs-${BITBUCKET_BUILD_NUMBER}"
      - pip install awscli
      - aws --region "us-east-1" cloudformation delete-stack --stack-name ${STACK_NAME}
    services:
      - docker
    caches:
      - pip

pipelines:
  default:
    - <<: *setup
    - <<: *test
  branches:
    master:
      - <<: *setup
      - <<: *test
Finally, we wanted our tests to be reproducible and idempotent, so we used the after-script feature in Pipelines to clean up the infrastructure resources once our tests have run.
after-script:
  # delete the stack
  - STACK_NAME="bbci-pipes-test-infrastructure-ebs-nodejs-${BITBUCKET_BUILD_NUMBER}"
  - pip install awscli
  - aws --region "us-east-1" cloudformation delete-stack --stack-name ${STACK_NAME}
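Keep in mind that delete-stack only starts the deletion and returns immediately. If you want the build to confirm the teardown actually finished, you could extend the after-script with a wait; this is an optional variation, not what our pipeline does:
after-script:
  # delete the stack, then block until CloudFormation confirms it's gone
  - STACK_NAME="bbci-pipes-test-infrastructure-ebs-nodejs-${BITBUCKET_BUILD_NUMBER}"
  - pip install awscli
  - aws --region "us-east-1" cloudformation delete-stack --stack-name ${STACK_NAME}
  - aws --region "us-east-1" cloudformation wait stack-delete-complete --stack-name ${STACK_NAME}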
By using Infrastructure as Code in the manner described above, we've seen several benefits across our software development lifecycle:
- Reduced costs: Test infrastructure is set up on demand and only runs while we need it, which keeps costs down.
- Reduced maintenance: We no longer set up infrastructure resources manually, so there is far less infrastructure to maintain and document. And because the application and infrastructure code live in the same repository, it's easier to improve and evolve them together.
- Reduced risk: Infrastructure changes go through the same code review as our application code, so it's much easier to spot risks such as security vulnerabilities or configuration errors.
The AWS CloudFormation pipe and Bitbucket Pipelines allow us to make the most of Infrastructure as Code, automating our infrastructure on demand and enabling us to build and test our application with a minimum of fuss and effort. Try it out yourself today!