Updates to Bitbucket Pipes

This blog is part of 12 days of CI/CD, a celebration of all things CI/CD. Click here for more content and stay up to date by following us on Twitter!

Today's high-performing teams rely on best-of-breed tools to help them deliver software to their customers. Integrating these tools is more important than ever, and earlier this year we announced Bitbucket Pipes to help teams easily build powerful, automated CI/CD workflows in a plug-and-play fashion. We partnered with industry leaders to make integrating the tools you know and love as easy as possible, and the strong adoption of pipes since launch is a big reason why almost 6 million build minutes now run in Bitbucket Pipelines every week.

We've made numerous improvements to Pipes since launch: almost doubling the number of supported pipes available, making it easier to automate your CI/CD pipelines, and simplifying how you create custom pipes to meet your specific workflow needs.

New Pipes

There are now 59 pipes available for teams to build and automate their CI/CD pipelines, almost double the number available at launch. These pipes cover a broad range of categories to meet the needs of any team, from deployment through to artifact management, security, and monitoring. And because supported pipes are updated and maintained by their authors, you never have to worry about updating or re-configuring them yourself. Check back soon for a separate blog about these new pipes.

Discoverability

We've made it easy to browse pipes with a searchable, full-screen view when editing your .yml file, and a searchable directory of pipes outside the product. Now it's simpler than ever to find the pipes you need and discover more that improve your workflow.

Array types

If you need to pass a variable containing a list of values (e.g. a list of extra arguments), you can use the new array type and define it directly in the pipe definition. This prevents issues with quoting and escaping characters in pipes. Find out more details here.

Example:

- pipe: myaccount/myrepo:1.0
  variables:
    STRING_VAR: 'my string'
    EXTRA_ARGS: ['--description', 'text containing spaces', '--verbose']
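Inside the pipe container, an array variable is exposed as a set of indexed environment variables (EXTRA_ARGS_COUNT plus EXTRA_ARGS_0, EXTRA_ARGS_1, and so on) — check the pipes documentation for the current behavior. Assuming that convention, a pipe script could rebuild the list like this (the example values simulate what Pipelines would inject):

```shell
#!/bin/sh
set -e

# Simulated values, as Pipelines would inject them for the EXTRA_ARGS array above.
EXTRA_ARGS_COUNT=2
EXTRA_ARGS_0='--description'
EXTRA_ARGS_1='text containing spaces'

# Walk the indexed variables and collect them into a single quoted string.
args=""
i=0
while [ "$i" -lt "${EXTRA_ARGS_COUNT:-0}" ]; do
  eval "arg=\${EXTRA_ARGS_${i}}"
  args="$args \"$arg\""
  i=$((i + 1))
done

echo "Collected:$args"
# prints: Collected: "--description" "text containing spaces"
```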

Default variables

Passing variables to pipes can be repetitive, especially for credentials, since they need to be explicitly referenced each time.

Before:

- pipe: sonarsource/sonarcloud-quality-gate:0.1.3
  variables:
    SONAR_TOKEN: 'XXXXX'

After:

- pipe: sonarsource/sonarcloud-quality-gate:0.1.3

If the repository variable SONAR_TOKEN is already in the context, it will be automatically passed to the pipe. Pipe authors simply need to define defaults in the variables section of pipe.yml when writing the pipe.

Example pipe.yml:

name: SonarCloud Quality Gate check
description: Check if a project / analysis passed the quality gate check
image: sonarsource/sonarcloud-quality-gate:0.1.3
variables:
  - name: SONAR_TOKEN
    default: '${SONAR_TOKEN}'

The full example can be found here.
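On the pipe side, the entry script can then treat the variable as required and fail fast if neither the pipeline step nor the repository-variable default supplies it. A minimal sketch (the placeholder token and messages are illustrative, not the actual sonarcloud-quality-gate implementation):

```shell
#!/bin/sh
set -e

# Simulate the value Pipelines would inject from the repository variable.
SONAR_TOKEN='XXXXX'

# The ":?" expansion aborts the script with an error message
# if SONAR_TOKEN is unset or empty.
SONAR_TOKEN=${SONAR_TOKEN:?'SONAR_TOKEN variable missing.'}

echo "SONAR_TOKEN is set; running the quality gate check."
```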

Passing information between pipes

If you want pipes to be able to share state or other information with each other, we have added two variables that exist for the duration of a pipeline:

  • BITBUCKET_PIPE_STORAGE_DIR  Use this directory to pass information or artifacts between subsequent runs of your pipe.
  • BITBUCKET_PIPE_SHARED_STORAGE_DIR If you need to get information from another pipe, you can read it from this directory.

So, if you want to store data for your own pipe’s consumption in a later step in the pipeline, you could write a script that looks like this:

#!/bin/sh
set -e
echo '{"key": "value"}' >> "$BITBUCKET_PIPE_STORAGE_DIR/<filename>"

Then to read this data back in a later step:

#!/bin/sh
set -e
cat "$BITBUCKET_PIPE_STORAGE_DIR/<filename>"

You can also read data from another pipe. To do this you need to know the key for the pipe you want to use, and the account that owns it.

#!/bin/sh
set -e
cat "$BITBUCKET_PIPE_SHARED_STORAGE_DIR/<pipeAccount>/<pipeName>/metadata.json"

For example, to access the aws-lambda-deploy-env file from the aws-lambda-deploy pipe, you would write:

#!/bin/sh
set -e
cat "$BITBUCKET_PIPE_SHARED_STORAGE_DIR/atlassian/aws-lambda-deploy/aws-lambda-deploy-env"
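For context, the scripts above run inside pipes; from the pipeline's point of view, the sharing is invisible. A sketch of a bitbucket-pipelines.yml where the same (hypothetical) pipe writes state in one step and reads it back in a later one:

```yaml
pipelines:
  default:
    - step:
        name: Produce state
        script:
          - pipe: myaccount/myrepo:1.0  # first run writes to BITBUCKET_PIPE_STORAGE_DIR
    - step:
        name: Consume state
        script:
          - pipe: myaccount/myrepo:1.0  # later run reads the same directory
```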

We’re making it easier than ever to integrate the tools you know, love, and depend upon in your CI/CD workflow, and the best is yet to come. Have any feature requests or suggestions? Let us know!