On Sat, 19 Dec 2009, Bill Lear wrote:

> On Saturday, December 19, 2009 at 22:15:00 (-0500) Nicolas Pitre writes:
> >On Sun, 20 Dec 2009, Johan 't Hart wrote:
> >
> >> Is git able to handle 4Gig files? I've heard git loads every file completely
> >> in memory before handling it...
> >
> >Right. So with current Git you will be able to deal with 4GB files only
> >if you have a 64-bit machine and more than 4GB of RAM.
>
> ??
>
> % uname -a
> Linux pppp 2.6.31.6-166.fc12.i686 #1 SMP Wed Dec 9 11:14:59 EST 2009 i686 i686 i386 GNU/Linux
> % cat /proc/meminfo | grep MemTotal
> MemTotal:        3095296 kB
> % mkdir gogle
> % cd gogle
> % git init
> % dd if=/dev/zero of=zerofile.tst bs=1k count=4700000
> % git add *
> % git commit -a -m new
> [master (root-commit) 35a25be] new
>  1 files changed, 0 insertions(+), 0 deletions(-)
>  create mode 100644 zerofile.tst
> % git --version
> git version 1.6.5.7
>
> Seems ok to me...

That's the easy part.  Diffing such files and delta-compressing them, or
even checking them out, especially when they are delta-compressed, just
won't work if you don't have the RAM.  Fixing this limitation would
introduce significant complexity into the code, which no one has felt
was worth it so far.

I had some thoughts about supporting the addition of really huge files
in a Git repository where only add/commit/checkout/fetch/push would
work, with no delta compression.  That hasn't materialized yet, though.

Nicolas
--
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html
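[Editor's note: one existing knob is relevant to the delta-compression cost discussed above. Git already lets you opt individual paths out of delta compression entirely via the `delta` attribute in gitattributes(5); this doesn't fix the in-memory diff/checkout limits Nicolas describes, but it avoids the repack-time delta work for huge blobs. A minimal sketch, with the file patterns chosen purely as examples:]

```
# .gitattributes -- tell pack-objects never to attempt delta
# compression for blobs matching these paths (example patterns)
*.tst   -delta
*.bin   -delta
```

With this in place, `git gc` and `git repack` will store matching blobs whole (still zlib-compressed) instead of searching for delta bases, which is where much of the memory and CPU cost for multi-gigabyte files would otherwise go.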