Bitbucket Cloud / BCLOUD-18262

Feature request - Pipeline command to modify repository/deployment variables


      It would be handy to have commands available to the pipeline for managing deployment and repository variables. I realise there is an API to handle this, but a simple one-line command would be much more practical.

      My use case is that I have pipelines that create a number of AWS CloudFormation stacks and deploy applications to AWS. I want to store some information about these stacks and applications somewhere that can be passed to future pipelines, e.g. the Elastic Beanstalk EnvironmentId, ApplicationName, EnvironmentName, StackId, etc. These values change due to blue/green deployments, new stacks being spun up and old ones being terminated. Manually changing them is error prone, and writing code to make API calls for a simple variable update seems like overkill.
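
      For context, updating a single repository variable through the existing REST API looks roughly like the snippet below. This is only a sketch: the workspace, repo and variable names are placeholders, the variable's UUID has to be looked up first, and the exact endpoint paths should be checked against the Bitbucket Cloud API documentation.

      #!bash
      # Look up the UUID of the variable to update (placeholder workspace/repo names).
      VARIABLE_UUID=$(curl -s -u "$BB_USER:$BB_APP_PASSWORD" \
        "https://api.bitbucket.org/2.0/repositories/my-workspace/my-repo/pipelines_config/variables/" \
        | jq -r '.values[] | select(.key=="EB_ENVIRONMENT_NAME") | .uuid')

      # Overwrite the variable's value with the new Elastic Beanstalk environment name.
      curl -s -u "$BB_USER:$BB_APP_PASSWORD" -X PUT \
        -H "Content-Type: application/json" \
        -d '{"key": "EB_ENVIRONMENT_NAME", "value": "my-env-green", "secured": false}' \
        "https://api.bitbucket.org/2.0/repositories/my-workspace/my-repo/pipelines_config/variables/$VARIABLE_UUID"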

      I have worked around this by storing the information in AWS SSM ParameterStore, which works well.

      An example of retrieving a value from here is as simple as:

      #!bash
      export EB_APP_NAME=$(aws ssm get-parameter --name /eb/$1/$ENV/application-name --query 'Parameter.Value' --output text)
      export EB_ENVIRONMENT_ID=$(aws ssm get-parameter --name /eb/$1/$ENV/environment-id --query 'Parameter.Value' --output text)
      export EB_ENVIRONMENT_NAME=$(aws ssm get-parameter --name /eb/$1/$ENV/environment-name --query 'Parameter.Value' --output text)
      

      And overwriting an existing parameter using Boto3:

      #!python
      
      import boto3
      
      # application_name, env_type and env are set earlier in the deployment script.
      ssmClient = boto3.client('ssm')
      ssmClient.put_parameter(
          Name='/eb/' + application_name + '/' + env_type + '/environment-name',
          Value=env['EnvironmentName'],
          Description='Elastic Beanstalk environment name for ' + env_type,
          Type='String',
          Overwrite=True
      )
      

      In short, a command line tool to complement the API would be useful, much like the AWS CLI complements their APIs.
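
      Purely as an illustration of the kind of syntax that would help (no such command exists today, and the name and flags below are entirely hypothetical):

      #!bash
      # Hypothetical CLI -- not an existing Bitbucket tool; shown only to illustrate the request.
      bitbucket variable set --repository my-workspace/my-repo EB_ENVIRONMENT_NAME "my-env-green"
      bitbucket variable set --deployment production STACK_ID "$STACK_ID"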

            greg-1-anderson added a comment -

            I have a similar requirement. Our pipeline jobs need to create an external resource that is used in the tests. We assign an environment variable that contains the name of the resource to create and use.

            To achieve intra-step-persistent variables (available from one line of the step to the next), we use a shell script to set up and write the values as bash environment variables:

            #!bash
            
            # Append runtime-calculated exports to a file that later script lines can source.
            # Single quotes defer expansion of $BITBUCKET_BUILD_NUMBER until the file is sourced.
            BASH_ENV=~/.bashrc
            echo 'export CI_ENV=ci-$BITBUCKET_BUILD_NUMBER' >> $BASH_ENV
            

            Then, in our bitbucket-pipelines.yml:

            #!yaml
            
            image: quay.io/pantheon-public/build-tools-ci:4.x
            
            pipelines:
              default:
                - step:
                    caches:
                      - composer
                    script:
                      - apt-get update && apt-get install -y unzip
                      - curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
                      - composer install --no-ansi --no-interaction --optimize-autoloader --no-progress
                      - ./.bitbucket/set-up-globals.sh
                      - source ~/.bashrc && echo "CI env is $CI_ENV"
            

            Using this technique, we can create as many environment variables as we need, calculated at runtime. We need to source the .bashrc file on every line where the variables will be used, though.

            To make this technique more convenient, Bitbucket Pipelines could do three simple things:

            1. Define the BASH_ENV variable automatically, so that we do not need to define it in scripts or hard-code it. (It can point at any writable location, e.g. $HOME/.bashrc.)
            2. Source the file indicated by $BASH_ENV at the start of each line of every step, if it points at a file that exists (see the sketch below).
            3. Introduce a place in the bitbucket-pipelines.yml file to define key: value pairs that are evaluated via bash and then appended to the end of the $BASH_ENV file as 'export' lines.

            Each of these things would provide incremental value; it would not be necessary to provide all three at once.
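
            For example, item 2 could amount to Pipelines implicitly prefixing each script line with a guard like the one below. This is only a sketch of the suggested behaviour, not anything Pipelines does today, and ./run-tests.sh stands in for whatever the line actually runs.

            #!bash
            # What a script line would effectively become if Pipelines sourced $BASH_ENV automatically.
            [ -f "$BASH_ENV" ] && source "$BASH_ENV"
            ./run-tests.sh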


            craigchapman1975 added a comment -

            Hi Aneita,

            Thanks for responding.

            I guess it seemed more appropriate to store the variables in Pipelines as it’s Pipelines that is consuming them. That said, I don’t really have a problem storing the variables in AWS SSM ParameterStore.

            To answer your other question: in almost all circumstances they are to be used by subsequent pipelines in the same repo. The exception is for repos that create CloudFormation stacks that launch an Elastic Beanstalk environment. There is one repo responsible for creating and updating the stack using CloudFormation templates, and another repo with the application code for building and deploying the application. After stack creation, the StackId and Elastic Beanstalk environment variables are stored. The application code repo later makes use of the Beanstalk variables for deployment purposes and is also able to modify them as environments get replaced, etc.

            Hopefully that makes sense.

            Thanks,
            Craig


            Aneita added a comment -

            Hey @craigchapman1975,

            Thanks for reaching out and for the suggestion. I can see why automating this process would be handy.

            Given our current priorities, it's unlikely that we'll support this anytime soon. In the meantime, however, I'll open this ticket to gauge the interest of other users in seeing the same thing.

            To help me better understand, why is there a preference for storing these variables inside of Pipelines instead of in AWS SSM ParameterStore? Also to clarify, are these variables required for subsequent pipelines within the same repo or in other repos under the same account?

            Thanks,
            Aneita


              Assignee: Unassigned
              Reporter: craigchapman1975
              Votes: 10
              Watchers: 8