cloning of large remote repositories fails due to timeout

Issue #4630 invalid
Berkey Michaels created an issue

Hi, I have been trying to clone git:// (large repository) via "Repositories" / Import repository

However, Bitbucket always times out (cloning the same repository on Gitorious or GitHub works, however); sometimes a proper error report is shown, but usually not.

While the website says that repository size is not limited by default, there is obviously a hard limit imposed by the cloning timeout, as this issue demonstrates.

It would be great if you could scale the timeout according to the repository size and simply renice the background process, so that large clones are handled properly, just at a lower priority (CPU- and bandwidth-wise).


Comments (28)

  1. Marcus Bertrand staff

    While there aren't limits, there is a hard timeout that will occur if the repository is quite large. How much data are you trying to import?

  2. Berkey Michaels reporter

    Like I said, it's a huge repository - the repository itself is ~5-6 GB - and it does take a while to clone, even on a dedicated server with a 100 Mbit connection.

    That said, gitorious and github seem to be able to deal with it - in fact, there are dozens of clones of this repository.


  3. Marcus Bertrand staff

    For a repository that large, I would recommend using SSH. But it is also worth noting that any repo over 1 GB may have issues using Git/Mercurial with any service. If you'd like more specific information about how you can reduce the size of your repository or remove old binaries, please send a request to and we'll give you more detailed information.
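
    The size-reduction advice above usually starts with finding the oversized blobs in the repository's history. A minimal sketch using only stock Git commands, run inside a local clone (the 1 MiB threshold and the top-20 cutoff are arbitrary example values):

    ```shell
    # List the 20 largest blobs anywhere in the repository's history,
    # so oversized binaries can be identified before removing them.
    git rev-list --objects --all |
      git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' |
      awk '$1 == "blob" && $3 > 1048576 { print $3, $4 }' |
      sort -rn |
      head -n 20
    ```

    Each output line is the blob's size in bytes followed by its path; actually removing those files from history requires a history-rewriting tool and a force push.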

  4. Berkey Michaels reporter

    I am not sure the status should have been changed to INVALID: yes, it's a large repository, but there are much larger repositories in use with Git.

    This is not about needing information on reducing the size of the repository: it's not my repository, so it's out of my control - and other services like GitHub and Gitorious handle the size quite well, including some repositories even larger than this one.

    If you insist on keeping this INVALID, you should definitely mention these restrictions on your website, e.g. in your FAQs. Just suggesting that this is a Git-specific problem is not helpful, and doesn't address the fact that other Git-hosting services manage such repositories quite well.

  5. Eric Peterson

    Running into the same problem. I definitely do not agree that this is invalid unless you have specified a repository size restriction.

  6. Julien Leloup

    Same thing here, from a push perspective: a newly created Unreal Engine 4 project weighs nearly 1.6 GB. It cannot be pushed due to this timeout restriction.

    Is there any plan to change this timeout? Or to allow users to modify it within a reasonable range?

  7. Matt Sanders

    I have never had a problem with my small line-of-business applications, but I have now been playing with Unity3D and the repo size is much bigger. I have not been able to download the repo I pushed back onto my laptop.

  8. Alan Agon

    Similar sort of issue as the others here. We can't clone our repo at all without the remote hanging up. I've changed the default buffer to no avail.
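
    "The default buffer" here most likely refers to Git's `http.postBuffer` setting, which only affects the HTTPS transport (not SSH or git://). A sketch of raising it, assuming that is the setting meant; the 500 MiB value is an arbitrary example:

    ```shell
    # Raise the buffer Git uses for HTTP POST requests so large transfers
    # over HTTPS are sent in one request instead of being chunked.
    git config http.postBuffer 524288000

    # Verify the setting took effect in the current repository:
    git config http.postBuffer
    ```

    Note that this changes only client-side behavior; it cannot extend a timeout enforced on the server.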

  9. Xander Smalbil

    What the hell is going on? GitHub repos I can pull; Bitbucket hangs with "index-pack failed". Tried multiple repos, same stuff.

  10. Matt Sanders

    Is there another version of this issue that has not been marked "invalid"? This is a legitimate problem, especially when pulling down repos while travelling on extremely slow internet connections.

  11. Ryan Brady

    For most people 1 GB is enough, but large repos are necessary for some projects. I don't think this issue should be invalid.

  12. Andre Posch

    Same issue here: I have an old project repo with a large size (2 GB). I wanted to clone it again after some time and can't anymore, because errors occur after approximately half of the data is downloaded (on a very fast line). There isn't even a working option on the website to get a zipped version of some revision - the download function says the repo size is too big. No way to get the data.

  13. Bart de Boer

    Same issue here, for a repository that is a whopping 115.7 MB large. Command-line Git, SourceTree: all clones/pulls fail. NOT an invalid issue. If this isn't solved soon, we'll be forced to move to GitHub...

  14. David Burgess

    Same issue here: the clone keeps timing out. If I had a faster connection it would work. Our repo is just over 1 GB, but it's taking over 30 min to clone (my hunch is there is a timeout limit at 30 min). It reaches 93% and then fails with "remote end hung up" or some similar message. Is there a way to configure the timeout? Yes, I know I can start with a local copy and/or do a shallow clone, but these options are not available when running from TeamCity, so we're stuck.

  15. David Burgess

    My solution was to upgrade to a 64-bit version of Git. Once I'd done that, the transfer rate was much higher and I didn't run into the timeout problems.

  16. Matt Kincaid

    You can download the repository as a zip from the website and then load it into a new, clean repository. Alternatively, you can specify a history depth when pulling, which might help.

    In general, this is a learning point about what you can and can't put into a Git repo.
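
    The history-depth suggestion above corresponds to Git's `--depth` option for shallow clones, which can keep the transfer small enough to finish before a server-side timeout. A sketch, with a placeholder repository URL and directory name:

    ```shell
    # Shallow clone: fetch only the most recent commit instead of the
    # full history. The URL and directory name are placeholders.
    git clone --depth 1 https://bitbucket.org/team/big-repo.git

    # Later, the remaining history can be fetched incrementally:
    git -C big-repo fetch --unshallow
    ```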

  17. Gaurav Raghav

    Any update on this issue, @Marcus Bertrand? We have a 280 MB repository and the pull is constantly failing via Eclipse/SourceTree etc. Is there any workaround we can use to get it going?

  18. David Burgess

    Make sure you're using a 64-bit version of Git. I use TortoiseGit and switched from the 32-bit version to 64-bit. Everything sped up, and I no longer got timeouts. Our repo size is over 1 GB.
