Single large file causes out of memory

Issue #278 wontfix
Former user created an issue

This script creates a Subversion repository with a single revision containing a single large file, which hgsubversion cannot clone:

{{{
#!/bin/bash -ex
for d in hgmemorybug_svnrepo hgmemorybug_svnwc hgmemorybug_hg; do
    test -d "$d" && rm -rf "$d"
done
svnadmin create hgmemorybug_svnrepo
svn co "file://$(pwd)/hgmemorybug_svnrepo" hgmemorybug_svnwc
cd hgmemorybug_svnwc
dd if=/dev/zero bs=1M count=1000 of=bigfile
svn add bigfile
svn commit -m 'a big file'
cd ..
hg clone "file://$(pwd)/hgmemorybug_svnrepo" hgmemorybug_hg
}}}

The clone fails with:

{{{
abort: out of memory
}}}

This is on a 32-bit machine, and the failure may stem from underlying limitations in hg itself rather than in hgsubversion. If the script succeeds on a 64-bit machine, try s/count=1000/count=3000/.

Similar to, but not the same as, Issue #110

Comments (2)

  1. Augie Fackler repo owner

    hg has known limitations with large files. I guess if someone wanted to do some bfiles support in hgsubversion I'd be semi-open to it, but I'm not holding my breath.

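The bfiles-style approach mentioned in the comment above avoids this class of failure by never holding a whole file revision in memory: content is processed in fixed-size chunks, so peak memory stays bounded regardless of file size. A minimal illustrative sketch of the difference (this is not hgsubversion or bfiles code; all names are hypothetical):

```python
# Sketch: whole-file read vs. chunked ("bfiles"-style) streaming.
# A single f.read() of a 1 GB file needs one 1 GB allocation, which a
# 32-bit process can easily fail to satisfy; streaming needs ~CHUNK_SIZE.
import hashlib
import os
import tempfile

CHUNK_SIZE = 1024 * 1024  # process 1 MiB at a time

def hash_whole_file(path):
    # Reads the entire file into one bytes object before hashing.
    with open(path, "rb") as f:
        return hashlib.sha1(f.read()).hexdigest()

def hash_streaming(path):
    # Feeds the hash incrementally; memory use is independent of file size.
    h = hashlib.sha1()
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:
                break
            h.update(chunk)
    return h.hexdigest()

# Small demo file so the example runs anywhere.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(b"\x00" * (3 * CHUNK_SIZE))

assert hash_whole_file(path) == hash_streaming(path)
os.remove(path)
print("digests match")
```

Both functions produce identical digests; only their peak memory differs, which is why a streaming design sidesteps the `abort: out of memory` seen in the repro above.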