Practice DevOps with custom Pipes reusing CI/CD logic

In February 2019, we released Bitbucket Pipes to allow DevOps teams to build more powerful and automated CI/CD workflows in a plug-and-play way. Pipes launched with several industry leaders, including Microsoft, AWS, Slack, Sonar and more, to help automate common CI/CD use cases with Bitbucket Pipelines, such as deploying to AWS S3 or sending a notification to Slack.

In addition to this, we released custom pipes, a simple way for DevOps teams to write their own pipes and abstract custom configuration across all of their repositories.

This feature is currently helping teams to:

  • Improve dev speed. DevOps teams can write and standardize any custom logic (deployment scripts, testing, notifications and alerts, etc.) that development teams use across repositories, allowing them to focus on what matters at the end of the day: delivering customer value.
  • Reduce duplication. Pipes are developed and maintained in a single repository and reused across many different repositories.
  • Simplify build configuration. Pipes can be updated and maintained by DevOps teams, which makes it trivial for development teams to set up external services across pipelines and repositories, greatly reducing the configuration needed and simplifying maintenance.
  • Version build scripts. Pipes allow DevOps teams to introduce new versions of build scripts, considerably reducing the risk of breaking existing builds.

How to create a simple “Hello World” custom pipe using Python

Step 1: Create a repository.

The first thing you need to do is to create a new repository with the directory structure looking like this:

- bitbucket-pipelines.yml    # CI/CD configuration
- Dockerfile                 # pipe runs inside a Docker container
- pipe.py                    # script to automate
- requirements.txt           # Python dependencies for the pipe

You can also clone the Python demo pipe repository, which contains all the source code explained in this article.

Please note: this example uses Python. If you use another language such as Bash, Go, or JavaScript, the required files will differ from the ones used in this article.

Step 2: Write the script.

Now, you will need to write the script that automates your CI/CD task. Pipes are just Docker containers that are executed with a specific runtime, so you can write your pipe in any language. We provide toolkits for Bash and Python to help with parameter validation, syntax coloring, debugging options, and more.

File: pipe.py

from bitbucket_pipes_toolkit import Pipe

# Schema describing the variables the pipe accepts
variables = {
  'NAME': {'type': 'string', 'required': True},
  'DEBUG': {'type': 'boolean', 'required': False, 'default': False}
}

# Validates the provided variables against the schema
pipe = Pipe(schema=variables)
name = pipe.get_variable('NAME')

pipe.log_info("Executing the pipe...")
pipe.log_info(f"Hello, {name}")

pipe.success(message="Success!")

File: requirements.txt

bitbucket-pipes-toolkit==1.6.4
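The toolkit reads each declared variable from the container's environment and validates it against the schema, failing the pipe with a clear error when a required variable is missing. A rough, hypothetical sketch of that behavior (for illustration only — the real bitbucket_pipes_toolkit uses a full schema validator under the hood):

```python
import os

# Hypothetical re-implementation of the toolkit's variable handling,
# illustrating how schema validation and defaults work.
def get_pipe_variables(schema, env=None):
    env = os.environ if env is None else env
    values = {}
    for name, spec in schema.items():
        raw = env.get(name)
        if raw is None:
            # Missing required variables fail the pipe immediately
            if spec.get('required'):
                raise SystemExit(f"Validation error: {name} is required")
            values[name] = spec.get('default')
        elif spec.get('type') == 'boolean':
            # Environment variables are strings, so booleans are parsed
            values[name] = raw.lower() in ('1', 'true', 'yes')
        else:
            values[name] = raw
    return values

schema = {
    'NAME': {'type': 'string', 'required': True},
    'DEBUG': {'type': 'boolean', 'required': False, 'default': False},
}

print(get_pipe_variables(schema, env={'NAME': 'Raul'}))
# {'NAME': 'Raul', 'DEBUG': False}
```

This is why the pipe in pipe.py can simply call pipe.get_variable('NAME') and trust that the value exists and has the right type.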

Step 3: Build and test the pipe.

Once the script is written, you need to create a Docker image with all of the required dependencies and add the script to it. It's really important to keep the Docker image as lightweight as possible, as its size affects build time. Then, configure pipe.py as the entry point so that it executes when the container starts.

File: Dockerfile

FROM python:3.7-slim

COPY pipe.py requirements.txt /
RUN pip install -r /requirements.txt

ENTRYPOINT ["python3", "/pipe.py"]

Build and test your pipe locally

Build the Docker image, then run it on your local machine to test the pipe before releasing it:

docker build -t your-dockerhub-account/your-pipe-name .

docker run --rm \
   -e NAME="Raul" \
   -v $(pwd):$(pwd) \
   -w $(pwd) \
   your-dockerhub-account/your-pipe-name

Step 4: Release your pipe

Once you have built and wrapped your pipe script in a Docker container, you can push it to a public Docker registry.

If you are testing locally, you can push your Docker image to Docker Hub, for example, with the following commands:

docker login --username "your-dockerhub-account" 
docker push your-dockerhub-account/your-pipe-name

However, we strongly recommend automating your build and release process with Bitbucket Pipelines, so that every time you change your code, it gets deployed and your repository is properly tagged with the version number. Here is an example of how you can easily do it:

File: bitbucket-pipelines.yml

image: atlassian/default-image:2

pipelines:
  default:
  - step:
      name: Build and Push
      deployment: production
      script:
        # Build and push image
        - VERSION="1.$BITBUCKET_BUILD_NUMBER"
        - IMAGE="$DOCKERHUB_USERNAME/$BITBUCKET_REPO_SLUG"
        - docker login --username "$DOCKERHUB_USERNAME" --password "${DOCKERHUB_PASSWORD}"
        - docker image build -t ${IMAGE}:${VERSION} .
        - docker image tag ${IMAGE}:${VERSION} ${IMAGE}:latest
        - docker image push ${IMAGE}
        # Push tags
        - git tag -a "${VERSION}" -m "Tagging for release ${VERSION}"
        - git push origin ${VERSION}
      services:
        - docker

Please note: at the moment we don't officially support private pipes, so custom pipes need to be published to a public Docker registry such as Docker Hub. We have created a feature request to gauge interest in supporting private pipes.

Step 5: Use your pipe

Finally, you can use your pipe across multiple repositories.

image: atlassian/default-image:2

pipelines:
  default:
  - step:
      name: Test pipe
      script:
        - echo "I created my first pipe" 
        - pipe: docker://your-dockerhub-account/your-pipe-name:1.1
          variables:
            NAME: "Raul"
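When Pipelines runs this step, each entry under variables is passed into the pipe's container as an environment variable (much like the -e NAME="Raul" flag in the local docker run test), which is how pipe.get_variable('NAME') sees the value. A simplified conceptual sketch of that hand-off, not the actual Pipelines runtime:

```python
# Hypothetical sketch of how the YAML `variables:` section reaches the pipe:
# Pipelines injects each key/value pair into the container's environment.
yaml_variables = {'NAME': 'Raul'}       # from the step's variables: section

container_env = dict(yaml_variables)    # roughly: docker run -e NAME="Raul" ...

# Inside the container, pipe.get_variable('NAME') boils down to a lookup:
name = container_env['NAME']
print(f"Hello, {name}")
# Hello, Raul
```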

For more details about how to write a pipe, you can check the official documentation.
