Pipeline Docker image that mounts S3 repository

Issue #13368 resolved
Bjorn Harvold created an issue

I have a Docker image I want to use: https://hub.docker.com/r/xueshanf/s3fs/

The image will mount an S3 bucket for me. I think this image could be really helpful for Maven builds that like to download the world: I can have a custom settings.xml point to the S3 bucket where all my external dependencies live, and my build/deploy time would go down dramatically.

There are a few caveats around configuring the container securely. It requires that the host has a .s3fs file containing my AWS access and secret keys. I could host the image in a private repository, but that's not ideal if this is going to be a solution for all your users.
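One way around that might be to write the .s3fs file at build time from secured repository variables instead of baking it into an image. A rough sketch of such a step fragment, assuming s3fs reads the usual ACCESS_KEY:SECRET_KEY passwd format from /root/.s3fs (the variable names and the path are placeholders, not anything the image requires):

```yaml
# Sketch only: AWS_S3_ACCESS_KEY and AWS_S3_SECRET_KEY would be secured
# Pipelines repository variables, so the keys never live in the image or the repo.
script:
  - echo "${AWS_S3_ACCESS_KEY}:${AWS_S3_SECRET_KEY}" > /root/.s3fs
  - chmod 600 /root/.s3fs
```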

I'd love some feedback on this solution and on how you could potentially add it to Bitbucket Pipelines.

Cheers, Bjorn

Comments (5)

  1. Bjorn Harvold reporter

    Hi Sam,

    The question here is about timing. If I specify the image I want to use, it already expects the .s3fs file to be present; by the time we get to my yaml file, it's already too late. The image also comes with a docker-compose.yml file to start it properly. You can find it here: https://github.com/xueshanf/docker-s3fs.

    Let me know if this could potentially work. If not, I see this as a great feature for Pipelines.

    Cheers, Bjorn

  2. Samuel Tannous staff

    Hi Bjorn,

    The image doesn't start s3fs by default, so you'll have to add the s3fs command from the docker-compose example to your yaml file. If you do that after writing the .s3fs file, timing shouldn't be an issue.
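    A minimal sketch of that ordering in bitbucket-pipelines.yml, assuming the keys come in as secured repository variables and that the build image (or one derived from it) also has Maven available. The bucket name, mount point, Maven paths and s3fs options are placeholders; the exact s3fs invocation should be lifted from the image's docker-compose.yml:

    ```yaml
    # Sketch only: xueshanf/s3fs as the build image, keys supplied via secured variables.
    image: xueshanf/s3fs

    pipelines:
      default:
        - step:
            script:
              # 1. Write the s3fs passwd file before anything tries to mount.
              - echo "${AWS_S3_ACCESS_KEY}:${AWS_S3_SECRET_KEY}" > /root/.s3fs
              - chmod 600 /root/.s3fs
              # 2. Mount the bucket (placeholder bucket name and mount point).
              - mkdir -p /mnt/s3
              - s3fs my-maven-bucket /mnt/s3 -o passwd_file=/root/.s3fs
              # 3. Build against the repository on the mount.
              - mvn -s settings.xml -Dmaven.repo.local=/mnt/s3/maven-repo clean install
    ```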

    Regards, Sam
