Issue #12750 resolved
Arjen Schwarz
created an issue

Currently it's only possible to have a single step defined for the default pipeline or for a branch. The ability to have multiple steps would make the YAML file far more readable, as the commands could be grouped by functionality. For example:

image: golang
pipelines:
  default:
    - step:
        script:
          - mkdir -p /go/src/github.com/ArjenSchwarz/aqua
          - mv * /go/src/github.com/ArjenSchwarz/aqua
          - apt-get update
          - apt-get install -y zip
          - cd /go/src/github.com/ArjenSchwarz/aqua
          - go get -t ./...
          - go test ./...
          - GOOS=linux GOARCH=amd64 go build -a -ldflags '-s' -o binary
          - curl -v -u $BB_ACCESS -X POST https://api.bitbucket.org/2.0/repositories/$BITBUCKET_REPO_OWNER/$BITBUCKET_REPO_SLUG/downloads/ -F files=@binary

This could instead be written as below, which makes it much more obvious what each part is for.

image: golang
pipelines:
  default:
    - prepare:
        script:
          - mkdir -p /go/src/github.com/ArjenSchwarz/aqua
          - mv * /go/src/github.com/ArjenSchwarz/aqua
          - apt-get update
          - apt-get install -y zip
          - cd /go/src/github.com/ArjenSchwarz/aqua
    - test:
        script:
          - go get -t ./...
          - go test ./...
    - build:
        script:
          - GOOS=linux GOARCH=amd64 go build -a -ldflags '-s' -o binary
    - deploy:
        script:
          - curl -v -u $BB_ACCESS -X POST https://api.bitbucket.org/2.0/repositories/$BITBUCKET_REPO_OWNER/$BITBUCKET_REPO_SLUG/downloads/ -F files=@binary

Official response

  • Aneita Yang staff

    Hi everyone,

    We're excited to announce that multiple steps is now available to all users, giving you the flexibility to configure Pipelines to better suit your teams' workflow.

    You can add additional steps to your pipeline by modifying your bitbucket-pipelines.yml file to include another step keyword. Each step can be configured to:

    • Use a different Docker image.
    • Use specific caches and services.
    • Produce artifacts that subsequent steps can consume.

    For more information on how to configure multiple steps, check out our documentation.
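
    For example, a minimal two-step configuration might look like the sketch below (the image names, cache and commands are illustrative, not a prescribed setup):

    pipelines:
      default:
        - step:
            name: Build
            image: node:8
            caches:
              - node
            script:
              - npm install
              - npm run build
            artifacts:
              - dist/**    # files passed on to the next step
        - step:
            name: Deploy
            image: python:3.6
            script:
              - pip install awscli
              - aws s3 sync dist/ s3://my-bucket    # my-bucket is a placeholder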

    As multiple steps is a stepping stone for manual steps and deployments, and parallel steps, we have separate issues to track those. If you have any feedback or new suggestions for the multiple steps feature, please raise a new issue.

Comments (38)

  1. Karl Fritsche

    It shouldn't only group commands; it's also needed to be able to use different images.

    Using the example from above:

    image: golang
    pipelines:
      default:
        - prepare:
            script:
              - mkdir -p /go/src/github.com/ArjenSchwarz/aqua
              - mv * /go/src/github.com/ArjenSchwarz/aqua
              - apt-get update
              - apt-get install -y zip
              - cd /go/src/github.com/ArjenSchwarz/aqua
        - test:
            image: myTestImage # environment is set testing here
            script:
              - go get -t ./...
              - go test ./...
        - build:
            script:
              - GOOS=linux GOARCH=amd64 go build -a -ldflags '-s' -o binary
        - deploy:
            image: myDeployImage  # environment is set production here (no dev dependencies)
            script:
              - curl -v -u $BB_ACCESS -X POST https://api.bitbucket.org/2.0/repositories/$BITBUCKET_REPO_OWNER/$BITBUCKET_REPO_SLUG/downloads/ -F files=@binary
    

    Not sure, it's probably too complex, but being able to define your own commands could come in handy at some point.

    image: golang
    commands:
      prepare:
        script:
          - mkdir -p /go/src/github.com/ArjenSchwarz/aqua
          - mv * /go/src/github.com/ArjenSchwarz/aqua
          - apt-get update
          - apt-get install -y zip
          - cd /go/src/github.com/ArjenSchwarz/aqua
      build:
        script:
          - GOOS=linux GOARCH=amd64 go build -a -ldflags '-s' -o binary
      deploy:
        image: myDeployImage  # environment is set production here (no dev dependencies)
        script:
          - curl -v -u $BB_ACCESS -X POST https://api.bitbucket.org/2.0/repositories/$BITBUCKET_REPO_OWNER/$BITBUCKET_REPO_SLUG/downloads/ -F files=@binary
    
    
    pipelines:
      default:
        - prepare
        - step:
            image: myTestImage # environment is set testing here
            script:
              - go get -t ./...
              - go test ./...
        - build
      branches:
        master:
          - prepare
          - step:
              image: myTestImage # environment is set testing here
              script:
                - go get -t ./...
                - go test ./...
                - go test ./... # even more tests (e.g. integration tests)
          - build
          - deploy
    
  2. Zaki Salleh

    Hi Arjen and Karl,

    Thank you for your feedback on this ticket. We are currently reaching out to customers to understand how they might use multiple steps in their workflow. If you have 30 mins and would like to give me some feedback, please contact me at zsalleh [at] atlassian.com

    Regards, Zaki

  3. Eftymios Sarmpanis

    Multiple steps per branch is the cornerstone of pipelining.

    The most common case that I can think of is having the main branch (usually master) deployed to a specific environment (say test or snapshot), or even building and pushing a Dockerized version to a registry. In this scenario at least two steps are needed, each with its own image, because in the first step the QA is going to run inside its own container (say node, JVM, etc.) and the second step is going to need a dind (Docker in Docker) container to build and push the image.

    The above is not a corner case, but rather something that a lot of people are already doing in their CI environment ... why not here as well?
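
    For illustration, a sketch of what such a two-step setup could look like (the image, registry and commands are placeholders):

    image: node
    pipelines:
      branches:
        master:
          - step:
              name: QA
              script:
                - npm install
                - npm test
          - step:
              name: Build and push Docker image
              services:
                - docker    # gives the step access to a Docker daemon
              script:
                - docker build -t registry.example.com/my-app .
                - docker push registry.example.com/my-app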

  4. nauman hafiz

    I agree. Currently with one step the "pipeline" is essentially one pipe and it is really difficult to build out any semi-complex CI/CD flow. For example we would like to be able to:
    - run a default/prepare step for all branches
    - split up a branch into multiple steps so it is more manageable
    - run multiple steps with different images

    Without some of these features, I feel that moving to Bitbucket Pipelines will be very challenging.

  5. j7b

    I have a requirement to package for multiple distros; the path of least resistance is to start from the correct Docker image.

    I really don't want to see the semantics changed as suggested above. I'd prefer:

    image: golang
    pipelines:
      default:
        - step:
            name: pogo
            script:
              - go install ./...
        - step:
            name: musl
            image: golang:alpine
            script:
              - apk update && apk add musl-dev git && go install ./...
    

    with name as an optional identifier for a step, and it would be nice if steps could run concurrently.

  6. Seppe Stas

    Bump. Since we have a lot of repositories containing both PCB design files and firmware, being able to use separate images to run tests/validations on the design files and the firmware would be great.

  7. Ozzy Ndiaye

    Hi,

    I didn't realise this was a missing feature. Very much a requirement for building a value stream map.

    I implemented everything else I needed and this missing feature was very much a disappointment.

  8. Tim Hobbs

    I have four step use cases:

    1. Simply organizing multiple scripts in a pipeline; this is purely semantics. For a human, comments work OK for organizing, but a (visualization) program should not depend on comments.
    2. Script reuse, as @Karl Fritsche has described.
    3. A building block for sequential or concurrent steps, as @j7b mentions. I would like to define fan-out/fan-in flows with mixed sequential and concurrent steps - in other words, complete flexibility.
    4. Use of a different image for each step in a pipeline. A single image with all languages, clients, etc. is unmanageable and monolithic. For example, if I want to test a Java webapp with BrowserStack, then I need an image with both Java and Ruby.

    As a workaround for simple reuse, all the step.script contents could be externalized to shell scripts and called individually - but surely there is a better way?
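
    A minimal sketch of that workaround, assuming the scripts live in a ci/ directory of the repo (the paths are hypothetical):

    image: golang
    pipelines:
      default:
        - step:
            script:
              # keep the actual logic in versioned shell scripts and call them from the step
              - ./ci/prepare.sh
              - ./ci/test.sh
              - ./ci/build.sh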

  9. Seppe Stas

    For my use case (running different test/build scripts for different parts of the repo with different Docker images), I've been experimenting with using the services feature to launch the Docker images I need and sending them test commands from the main container. Currently the main limitation of this approach is not having access to the "volumes" and "run" Docker daemon commands. This means that in order to give the services access to the repo contents, something like a network filesystem is needed, and to run the commands, some sort of RPC system or SSH has to be used.

    Did anyone try a similar approach or know if it would be possible to create docker volumes in Pipelines?

  10. Mike McGowan

    Multiple steps (each allowing its own image to be specified) is pretty much essential for us too.

    I need at least three steps, each with its own image, for each of the three main languages that comprise our webapp (PHP, Python, JavaScript).

    I don't want to have to create a single monolithic image - with all the languages and associated tooling from my existing images - just to work around this limitation in Pipelines.
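
    For example, something along these lines would be ideal (the images and commands are illustrative):

    pipelines:
      default:
        - step:
            name: PHP tests
            image: php:7.1
            script:
              - vendor/bin/phpunit
        - step:
            name: Python tests
            image: python:3.6
            script:
              - pytest
        - step:
            name: JavaScript tests
            image: node:8
            script:
              - npm install
              - npm test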

  11. Al McKinlay

    Another example of wanting multiple steps is if you have multiple languages you want to test.

    We have a repository with PHP and JavaScript, but trying to keep a Docker image with both Node and PHP up to date is not fun. I don't want to have to do this myself.

    This way I can have a PHP image for one step to run my PHPUnit tests, then another step with a Node image to run my Node scripts.

  12. John Lovie Too

    This is great. This is what we really need right now: a pre-build script, a build script, and a post-build script that runs regardless of whether the build script fails. The pre-build script handles installation/configuration/setup before running the scripts/tests, the build script runs the scripts/tests for build and deployment, and the post-build script runs afterwards for report generation, etc.

  13. Mathieux Bergeron

    In our case, we need to build C++ code for different flavors of Ubuntu: 14.04, 16.04, 17.04, etc. It's impossible for us to migrate to Bitbucket Pipelines while we are limited to only one image per pipeline.

    Hopefully the multi-step feature will get us there!
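
    For example, a sketch of what that could look like with one step per Ubuntu version (the build commands are illustrative):

    pipelines:
      default:
        - step:
            name: Build on Ubuntu 14.04
            image: ubuntu:14.04
            script:
              - apt-get update && apt-get install -y build-essential cmake
              - cmake . && make
        - step:
            name: Build on Ubuntu 16.04
            image: ubuntu:16.04
            script:
              - apt-get update && apt-get install -y build-essential cmake
              - cmake . && make
        # ... and likewise for ubuntu:17.04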

  14. Aneita Yang staff

    Hi everyone,

    Thanks for all of the feedback via the survey so far! For anyone who would still like to give input on multiple steps, you can complete the survey here. We will be closing the survey on Thursday 24th.

    Thanks!

  15. Aneita Yang staff

    Thank you to everyone who completed the survey!

    I'm excited to let you know that multiple steps is now in alpha! For anyone who is already part of the alpha group, the feature is available to you right now. If you're not part of the alpha group but would like to trial the feature, you can sign up to be an alpha customer.

    With the alpha version of multiple steps, every step can be configured to:

    • Use a different build environment.
    • Use specific caches and services.

    For more information on multiple steps and instructions on how to set it up, check out our documentation.

    If you have any feedback on the feature, please let us know.

    Thanks!
    Aneita

  16. Marcus Schumann

    I'm not an alpha customer, but looking at the documentation there doesn't appear to be a way to re-use steps across multiple different pipelines? E.g. all our pipelines (feature/*, master, deploy-to-{env}) always perform npm install and npm test, and all use the node cache. This feature wouldn't help with that, if I understood it correctly? It just allows multiple steps within the same pipeline?

  17. Giorgos Gaganis

    Hi

    Kudos for adding steps! That is a very helpful feature!

    I am a bit late since the survey has closed, but I would like to share an issue we currently have with Pipelines so that a solution might be considered in the future. The problem is that we have a lot of steps that are in essence independent of one another. In our current configuration we have added them as additional script commands, and on the whole it works, but because they run sequentially the whole pipeline takes a long time to complete.

    For us it would be helpful if we could split our configuration into steps that could also run concurrently. This way we would use a few more build minutes overall, but greatly improve our minimum feedback time.

    I think something similar was mentioned previously in the thread, e.g. point #2 by @Tim Hobbs.

    Thanks! Giorgos

  18. Aneita Yang staff

    Hi Marcus and Giorgos,

    The multiple steps feature that we have released to alpha is to allow users to organise their pipeline into a more logical grouping of commands, and to use different build environments and services.

    We have separate open feature requests for reusable steps (#12751) and parallel steps (#14354). I encourage you to vote for and watch those issues if you are interested in them.

    Thanks,
    Aneita

  19. Giorgos Gaganis

    @Aneita Yang Thank you for your swift reply! To me it seems very natural for steps to also be able to run in parallel, so in effect it also seemed logical to comment on this issue.

    But I will track the other issue as you suggested! Thanks!

  20. Aneita Yang staff

    Hi everyone,

    We're excited to announce that multiple steps is now available to all users, giving you the flexibility to configure Pipelines to better suit your teams' workflow.

    You can add additional steps to your pipeline by modifying your bitbucket-pipelines.yml file to include another step keyword. Each step can be configured to:

    • Use a different Docker image.
    • Use specific caches and services.
    • Produce artifacts that subsequent steps can consume.

    For more information on how to configure multiple steps, check out our documentation.

    As multiple steps is a stepping stone for manual steps and deployments, and parallel steps, we have separate issues to track those. If you have any feedback or new suggestions for the multiple steps feature, please raise a new issue.
