Suggestion
Resolution: Unresolved
It would be handy to have commands available in the pipeline for managing deployment and repository variables. I realise there is an API for this, but a simple one-line command would be much more practical.
My use case is that I have pipelines to create a number of AWS CloudFormation stacks and to deploy applications to AWS. I want to store some information about these stacks and applications somewhere it can be passed to future pipelines, e.g. the Elastic Beanstalk EnvironmentId, ApplicationName, EnvironmentName, StackId, etc. These values change due to blue/green deployments and to new stacks being spun up and old ones being terminated. Changing them manually is error prone, and writing code to make API calls for a simple variable update seems like overkill.
I have worked around this by storing the information in AWS SSM Parameter Store, which works well.
Retrieving a value from there is as simple as:
#!bash
export EB_APP_NAME=$(aws ssm get-parameter --name /eb/$1/$ENV/application-name --query 'Parameter.Value' --output text)
export EB_ENVIRONMENT_ID=$(aws ssm get-parameter --name /eb/$1/$ENV/environment-id --query 'Parameter.Value' --output text)
export EB_ENVIRONMENT_NAME=$(aws ssm get-parameter --name /eb/$1/$ENV/environment-name --query 'Parameter.Value' --output text)
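Writing a value back is also a one-liner from the CLI. This is a hedged sketch: the parameter path and value are made up, and since a real call needs AWS credentials, the command is only assembled and printed here.

```shell
# Illustrative path and value; drop the echo wrapper to actually run it.
PUT_CMD='aws ssm put-parameter --name /eb/myapp/prod/environment-name --value myapp-prod-env --type String --overwrite'
echo "$PUT_CMD"
```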
And overwriting an existing parameter using Boto3:
#!python
ssmClient.put_parameter(
    Name='/eb/' + application_name + '/' + env_type + '/environment-name',
    Value=env['EnvironmentName'],
    Description='Elastic beanstalk environment name for ' + args.env_type,
    Type='String',
    Overwrite=True
)
In short, a command-line tool to complement the API would be useful, much like the AWS CLI complements the AWS APIs.
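For comparison, here is roughly what an API-based variable update looks like today. The endpoint shape follows the Bitbucket Cloud 2.0 REST API for repository pipeline variables; the workspace, repository slug, and variable UUID are placeholders, and the request itself is left commented out because it needs credentials.

```shell
WORKSPACE="myworkspace"       # placeholder
REPO_SLUG="my-repo"           # placeholder
VAR_UUID="{variable-uuid}"    # placeholder: obtained from a prior GET on the variables list
URL="https://api.bitbucket.org/2.0/repositories/$WORKSPACE/$REPO_SLUG/pipelines_config/variables/$VAR_UUID"
echo "PUT $URL"
# curl -s -u "$BB_USER:$BB_APP_PASSWORD" -X PUT "$URL" \
#      -H "Content-Type: application/json" \
#      -d '{"key": "EB_ENVIRONMENT_ID", "value": "e-abc123"}'
```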
I have a similar requirement. Our pipeline jobs need to create an external resource that is used in the tests. We assign an environment variable that contains the name of the resource to create and use.
To achieve intra-step-persistent variables (available from one line of the step to the next), we use a shell script that computes the values and writes them out as bash environment variables:
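The script itself did not survive in this post; a minimal sketch of the idea, with all file and variable names being illustrative, would be:

```shell
#!/bin/bash
# setup_env.sh (illustrative): compute values at runtime and persist them
# as exports so later lines in the same step can source them.
RESOURCE_NAME="test-resource-$(date +%s)"   # illustrative runtime-computed value
{
  echo "export RESOURCE_NAME='$RESOURCE_NAME'"
  echo "export RESOURCE_REGION='us-east-1'"
} >> ~/.bashrc
```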
Then, in our bitbucket-pipelines.yml:
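The YAML was also dropped from this post; its shape is roughly as follows (the step layout and script names are illustrative):

```yaml
pipelines:
  default:
    - step:
        script:
          - chmod +x setup_env.sh && ./setup_env.sh
          - source ~/.bashrc && echo "Using resource $RESOURCE_NAME"
          - source ~/.bashrc && ./run_tests.sh "$RESOURCE_NAME"
```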
Using this technique, we can create as many environment variables as we need, calculated at runtime. However, we need to source the .bashrc file on every line where the variables are used.
To make this technique more convenient, Bitbucket pipelines could do three simple things:
Each of these things would provide incremental value; it would not be necessary to provide all three at once.