Support executing pipelines on user-provided infrastructure

Issue #15674 wontfix
Amit Prakash Ambasta
created an issue


Currently, pipelines execute on Bitbucket-provided infrastructure. This leads to several issues:

  1. Inability to use existing fixtures, databases, or services for the code being tested.
  2. Security limitations, such as the inability to execute pipelines on instances with pre-assigned roles (currently this is handled via secret/access keys).
  3. Having to copy large amounts of data, as opposed to using a pre-existing container with pre-loaded data.
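
To illustrate the current workaround mentioned in point 2, a minimal `bitbucket-pipelines.yml` might inject AWS credentials as repository variables rather than the step inheriting a pre-assigned IAM role from the host. This is a hedged sketch; the bucket name is a hypothetical placeholder, not part of the original report:

```yaml
# Hypothetical sketch: credentials are passed as repository variables
# (AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY) instead of the step
# inheriting an IAM role from user-provided infrastructure.
pipelines:
  default:
    - step:
        image: amazon/aws-cli:latest
        script:
          # The AWS CLI reads these environment variables automatically.
          - aws s3 ls s3://example-bucket --region us-east-1
```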

The ability to run pipelines on user-provided infrastructure would mitigate these issues. Please add support for this.

Comments (2)

  1. Matt Ryall (staff)

    Thanks for the suggestion, Amit, but we don't plan to offer this any time soon.

    Pipelines operates as a fully hosted service, which we think offers a lot of convenience to our customers and also means we can deliver new functionality quickly. We could not evolve our product as quickly if we also ran on customer infrastructure.

    Instead, we'd like to work out how to address these concerns in other ways:

    1. We believe this is best addressed by moving your fixtures into containers that can be started in a test environment. This makes testing easier for developers as well as consistent between different environments.
    2. We'd like to add AWS IAM role support to Pipelines at some point in the future. I can't see a ticket open for this yet, but if you would like something like this for AWS or another platform, please request it.
    3. We have #14588 as a request for volume mapping to support test data in containers. For the moment, any data stored in S3 in a "us-east" AWS region should be quite fast to load.
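
    The container-based approach in point 1 can be sketched with a Bitbucket Pipelines service definition. The image and variable names below are assumptions for illustration only; the idea is that a fixture (here, a database pre-seeded with test data) runs as a service container alongside the build step:

    ```yaml
    # Hypothetical sketch: a database fixture packaged as a service container.
    # "myorg/postgres-with-fixtures" stands in for an image pre-loaded with
    # test data; it is not a real published image.
    definitions:
      services:
        postgres:
          image: myorg/postgres-with-fixtures:latest
          variables:
            POSTGRES_DB: testdb
            POSTGRES_PASSWORD: example

    pipelines:
      default:
        - step:
            image: python:3.11
            services:
              - postgres
            script:
              - pip install -r requirements.txt
              # Tests connect to the service on localhost:5432.
              - pytest
    ```

    Because the fixture travels with the pipeline configuration, the same environment can be reproduced locally and in CI.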

    I hope this helps. Please let me know if you have any further questions or suggestions.
