Support Git LFS in Bitbucket Pipelines

Issue #13208 resolved
Alexey Gorshkov created an issue

At the moment it is impossible to build a project if it contains a file uploaded via Git LFS.

Official response

  • Davina Adisusila staff

    Hi everyone,

    We have shipped Git LFS support as an opt-in feature to all Pipelines users!

    To opt in, include the following in your bitbucket-pipelines.yml:

      clone:
        lfs: true
      # ... rest of Pipelines configuration ...

    This enables the downloading of LFS files with the git clone in the Build setup of your pipelines.

    Have a look at the blog and documentation for additional information.
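    For context, a complete minimal bitbucket-pipelines.yml using the opt-in could look like the sketch below; the image and step script are placeholders, not part of the announcement:

```yaml
# Sketch of a minimal pipeline with the LFS opt-in enabled.
clone:
  lfs: true            # download LFS files during the Build setup clone

pipelines:
  default:
    - step:
        script:
          - ls -lh     # placeholder step: LFS files arrive as real files, not pointers
```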


Comments (42)

  1. william welch

    we found a workaround -- install git-lfs on your image and get ssh set up so that you can:

    git lfs pull ssh://

    the lfs fetch is very fast compared to doing it locally (we are using Bitbucket LFS)

  2. Vegard Hansen

    Do you mind expanding on how you managed to get it to work? I'm just getting spammed with "warning: current Git remote contains credentials".

  3. Vegard Hansen

    Your workaround does not seem to work anymore. In the build log I can see that it finds the objects, but it's unable to download them. Git LFS does not support ssh:// as a protocol, so I'm not sure how you got it to work that way.

    Also interesting that not being able to download any objects exits with status 1; in my mind that seems like a pretty massive failure and should terminate the build. It took some time for me to actually see that it was failing. You also need to enable tracing to see what it's actually failing on: GIT_TRACE=1 git lfs pull. In short, it seems to fall back on basic authentication using the tokens, which does not work: you get a 401.
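    As a small local illustration of the tracing trick above (not specific to Pipelines), GIT_TRACE=1 makes git, and git-lfs, print what they are actually executing to stderr:

```shell
# GIT_TRACE=1 turns on git's trace output (printed to stderr).
# A harmless command is used here; with `git lfs pull` the same variable
# shows which endpoint and authentication mode LFS actually tries.
GIT_TRACE=1 git version 2>&1
```

    The same variable works for any git or git-lfs subcommand.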

    For others who manage to find their way in here through the jungle of SEO that makes everything related to Atlassian products impossible to debug and Google: in the current state you should only consider using Pipelines with Git LFS if your repositories are public. Don't waste your time until Atlassian addresses these somewhat fundamental issues directly or provides documentation that shows how to make it work. It's not worth your time.

  4. william welch

    Ah, I see the confusion. We are using a custom docker image that has our ssh credentials burned in. It's a version of the image we deploy, so of course it must be able to pull the private repo (and does).

  5. Vegard Hansen

    The documentation on git lfs is so contradictory. On the one hand they say only http(s) endpoints with basic authentication work. But on the other hand I have never set that up locally, and it works perfectly. Looking at the issue tracker, they seem to use the terms ssh and http interchangeably. But it looks like the actual download is done over http(s). It's just very confusing.

    Looking at git lfs env, it's pretty obvious why it's failing: they are adding an endpoint that doesn't exist or doesn't work with git lfs. Normally, when you use an SSH endpoint/remote, it should ask for git-lfs-authenticate, which returns a token that is valid for a certain time. This is then in turn used to download the files over http(s). This does not currently work, since the endpoint is set to a non-functioning URL. I don't even understand where that URL comes from.

    It's in a form like this; if you look locally, you will see that you have two endpoints, one for https and one for ssh:
    Endpoint= (auth=none)

  6. william welch

    To clarify my position on this issue, I did not offer my workaround as justification for the bug being marked "minor", it was more to share a solution with fellow travelers, having experienced Atlassian's legendary customer support on previous cloud offerings (living and EOL'd).

    Best regards,

  7. Vegard Hansen

    I completely understand, a little odd that we get to assign the priority when reporting issues.

    I'm just keeping my findings in this thread in case someone finds them, and to let the staff have it all in one place.

  8. Clemens Reisner

    Same problem here. Is there a known workaround for private repositories? I'd rather not add SSH credentials or deployment keys inside my docker container.

  9. Janne Nykänen

    To add my 50 cents: I managed to get LFS working this way:
    - created a public base image with the Git LFS binary installed (in addition to the tools I needed for the build): see example here:
    - in my bitbucket-pipelines.yml, I'm setting up read-only Bitbucket SSH access (my user is a read-only user) for the same repository, using an environment variable I added holding a base64-encoded private key (a "secret environment variable"). Assuming the variable name is MY_PEM_BASE64 and you clone your repo using the url, you could do it like this in the pipelines:

    image: huippujanne/python-27-awscli-git-lfs
    pipelines:
      default:
        - step:
            script:
              - mkdir /root/.ssh/ && chmod 700 /root/.ssh
              - echo "$MY_PEM_BASE64" | base64 -d > /root/.ssh/bb_rsa && chmod 600 /root/.ssh/bb_rsa
              - (echo 'StrictHostKeyChecking no'; echo 'Host'; echo '  IdentityFile /root/.ssh/bb_rsa') >> /root/.ssh/config && chmod 700 /root/.ssh/config
              - git remote remove origin
              - git remote add origin
              - TRACE=2 git-lfs pull
              - <here do the actual thing>
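    Janne's key-decoding step can be tried locally as a sketch, with a temp directory standing in for /root/.ssh and a dummy value standing in for the real MY_PEM_BASE64 secret:

```shell
# Sketch of the base64 round-trip used above (dummy key, temp dir).
sshdir=$(mktemp -d)
MY_PEM_BASE64=$(printf '%s' 'dummy-private-key' | base64)
echo "$MY_PEM_BASE64" | base64 -d > "$sshdir/bb_rsa" && chmod 600 "$sshdir/bb_rsa"
cat "$sshdir/bb_rsa"    # prints: dummy-private-key
```

    Storing the key base64-encoded avoids newline mangling when pasting it into the secured-variable web UI.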
  10. Farkas Levente

    Do you really think it's a minor bug? How can we build an LFS-enabled repo in this case? It seems Bitbucket heavily supports/pushes the usage of LFS, but running Pipelines with such a repo is currently very dirty!?

  11. Matt F

    Can pipelines incompatibility be added to "Current limitations for Git LFS with Bitbucket" page?

    Maybe it's documented elsewhere, but this issue was the only official recognition of the limitation I've found.

    This was a real stumper for me after I found out that deployment keys don't work with LFS. Thanks Janne for the suggestion to create a read-only user. It works great for my small team, but that's sadly not a good option for teams that already have 5/10/25/50/100 users and would need to pay more to upgrade just to create a read-only user to work around this limitation. If BB could add deployment key support to LFS, that would at least solve the extra-user problem. I know I could use an "App password", but the fact that it links to my user for automated processes puts me off.

  12. Andron Ocean

    I too would like to see this elevated above a minor bug. It's a real pain that this doesn't just work. I also agree with Matt about adding it to the "Current Limitations" page -- it's confusing that this thread is essentially the only place online where you can find out that LFS and BB pipelines don't play well together.

  13. Gert-Jan Rebel

    Sounds like a great feature to have out of the box down the line!

    For those who want a solution now: I've just extended my node image with git-lfs and it works like a charm.

    These are the steps I took:

    Hope it helps anyone!

  14. Matt Patterson

    I second the notion that this is not a minor bug. Given that Atlassian is pushing Bitbucket with LFS, please support git-lfs pull... please.

  15. Nick Chapman

    I just want to say that Janne Nykänen's answer works in the interim while Atlassian gets their act together. It is quite annoying that using LFS breaks Pipelines without a hacky workaround. I hope they do something to fix this soon.

  16. Nicolas Albert

    I have also made a custom docker image with git-lfs already installed, and added this in the script:

    • git config remote.origin.url "$(git config remote.origin.url | sed -n "s,.*\?:,https://$,p")"
    • git lfs pull
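    The rewrite in the first bullet can be sketched with a concrete, hypothetical remote (the real URL would come from `git config remote.origin.url` inside the build container): it turns an SSH-style remote into an HTTPS one so git-lfs can use HTTP authentication:

```shell
# Hypothetical SSH-style remote URL, for illustration only.
ssh_url="git@bitbucket.org:myteam/myrepo.git"
# Rewrite "git@host:path" into "https://host/path".
https_url=$(printf '%s\n' "$ssh_url" | sed -n 's,^git@\([^:]*\):,https://\1/,p')
echo "$https_url"    # prints: https://bitbucket.org/myteam/myrepo.git
```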
  17. Omar Ali

    This issue has been open for about a year now, and I fail to see the complexity of adding this.

    Can you please take action?

  18. Phil Gooch

    I've just enabled LFS and encountered the same issue - it breaks my build.

    I don't want to do a hacky workaround; it would be nice to have git lfs support in the pipeline.

    For now, I will have to remove LFS.

  19. Jens Bannmann

    We have the same problem. Implementing the workaround with git lfs pull is tedious and really shouldn't be necessary. Please fix this!

  20. Gene Podolyak

    Just an hour ago I read:

    Fun fact: Steve Streeting, the Atlassian developer who invented Sourcetree, is also a major contributor to the Git LFS project, so Sourcetree and Git LFS work together rather well.

    The real "fun fact" is this ticket is opened for 16 months. I will have fun reverting my LFS changes...

  21. Emmanuel Lange

    I think I have found a workaround that should work with any Docker image.
    One needs to add the pipeline SSH public key as a key in Bitbucket.
    The key is in Settings -> Pipelines -> SSH keys.
    No need to add a known host.

    the script does this:

    • install git-lfs repository using the standard command
    • apt-get install git-lfs
    • Pull the lfs data.

    added bonus: it's extremely fast (>80MB/s)

    image: python:3.6.4
    pipelines:
      default:
        - step:
            script:
              - curl -s | bash
              - apt-get -y install git-lfs zip
              - git lfs pull
  22. Dmitriy Belyaev

    Ok, after all the suggestions mentioned in this issue I managed to build my repository that has some LFS files.

    Steps to do this:

    • Prepare a custom docker image with built-in Git LFS (Alpine-based Dockerfile example, lines 18-25); in the case of the Alpine distribution I also added openssh in order to be able to use SSH from within the image.
    • Allow SSH access from Pipelines to your repository:
      • go to '<Repository> \ Settings \ Pipelines \ SSH keys' and copy public key (click 'Copy public key' button)
      • go to '<Repository> \ Settings \ General \ Access keys' and add this copied pipelines' public key to the list of allowed SSH keys on your repository (click 'Add key' button and insert copied public key, give it a reasonable label)
    • add an additional Git LFS pull at the beginning of your build script in order to pull large files from repository.
      In my case bitbucket-pipelines.yml build script looks like this:
    image: trustypanda/maven:alpine-3.5.2-openjdk8-lfs
    pipelines:
      default:
        - step:
            caches:
              - maven
            script:
              # Modify the commands below to build your repository.
              - git lfs pull
              - cd src # change to project directory
              - mvn -B verify # -B batch mode makes Maven less verbose

    Enjoy successful builds on your LFS repo!

    Of course, you can avoid building a custom docker image with pre-installed Git LFS and do the Git LFS install step during your build step instead (as @emmanuellange mentioned), but then you'll spend your Pipelines build time on the download/installation of Git LFS every time you run your Pipelines builds, and time is money in terms of Pipelines :)

  23. Matt Ryall

    Thanks for sharing your steps to get this working, @pure-apricot.

    We realise the steps required for this are far from ideal, and it seems relatively straightforward to fix this on our side. So we're now planning to add built-in LFS support to Pipelines in the near future.

    At the moment, we're planning for it to be an option in the YAML file to automatically pull LFS files when cloning, under the existing clone configuration section, like this:

      clone:
        lfs: true
      # ...

    We'll keep this ticket updated with our progress.

  24. Dmitriy Belyaev

    It would be great to avoid all this hassle with custom docker image setup and use built-in functionality in Bitbucket!

  25. Davina Adisusila staff

    Hi everyone,

    We have shipped Git LFS support as an opt-in feature to all Pipelines users!

    To opt in, include the following in your bitbucket-pipelines.yml:

      clone:
        lfs: true
      # ... rest of Pipelines configuration ...

    This enables the downloading of LFS files with the git clone in the Build setup of your pipelines.

    Have a look at the blog and documentation for additional information.


  26. Matteo Scanzano

    It seems not to work with custom pipelines:

    I have lfs: false in my configuration, and LFS files are not downloaded in the default pipeline (OK), but they are downloaded in my custom pipeline (am I missing something?)

    I get this in pipeline logs:

    Cloning into '/opt/atlassian/pipelines/agent/build'...
    + GIT_LFS_SKIP_SMUDGE=1 git clone -n https:// [...]
    Downloading path/to/file1 (1.0 MB)
    Downloading path/to/file2 (1.1 MB)
  27. Davina Adisusila staff

    Hi @scanzy

    The clone command above has the skip smudge flag, which indicates the feature is working as expected.

    What are the differences between your default and custom pipelines?
    Have you tried running the clone command with the flag locally at those commits?
    The flag ensures missing LFS objects are not downloaded, but files that are still in git history and exist in your .git/lfs/objects will still be downloaded.

    If you think it is still an issue with Pipelines, please raise a support ticket here so the Atlassian team can investigate further.


  28. Octavio Fernandes

    My pipeline is using microsoft/dotnet2.1 as its base image.
    I am adding the config to enable LFS during the clone, as stated in the documentation. However, no files get downloaded from LFS during the clone.

    If I explicitly download git-lfs and run git lfs install before cloning, then it works.

    Does anybody know the reason for this?
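    A plausible explanation (an assumption on my part, not something Atlassian has confirmed here): `git lfs install` registers the LFS smudge/clean filters in git config, and without those filters a clone checks out pointer files instead of the real content. Roughly, it writes entries like these, done by hand below in a throwaway repo so nothing global is touched:

```shell
# Approximation of the config entries `git lfs install` creates,
# applied to a throwaway local repo.
repo=$(mktemp -d) && cd "$repo" && git init -q
git config filter.lfs.clean  "git-lfs clean -- %f"
git config filter.lfs.smudge "git-lfs smudge -- %f"
git config filter.lfs.required true
git config filter.lfs.smudge    # prints: git-lfs smudge -- %f
```

    (Recent git-lfs versions also register a filter.lfs.process entry.)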
