Re: EXT :Re: GIT and large files


"Stewart, Louis (IS)" <louis.stewart@xxxxxxx> writes:

> Thanks for the reply.  I just read the intro to Git and I am
> concerned about the part that says it will copy the whole
> repository to the developers' work area.  They really just need
> the one directory and the files under it.  The history has TBs
> of data.

Then you will spend time reading, processing, and writing TBs of data
when you clone, unless your developers do something to limit the
history they fetch, e.g. by cloning shallowly.
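
A minimal sketch of that, assuming a hypothetical repository URL and
directory name (adjust both); the sparse checkout bits work on
Git 1.7.0 and later:

    # Fetch only the most recent commit, and do not populate the
    # working tree yet:
    git clone --no-checkout --depth 1 ssh://host/project.git
    cd project

    # Limit the working tree to the one directory they need.  Note
    # this only limits what gets checked out, not what the clone
    # downloads:
    git config core.sparseCheckout true
    echo "that/one/directory/" >> .git/info/sparse-checkout
    git read-tree -mu HEAD

Even a depth-1 clone still transfers every blob reachable from the
tip commit, so any 20+GB files that are live at the tip will be
fetched regardless.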

>
> Lou
>
> -----Original Message-----
> From: Junio C Hamano [mailto:gitster@xxxxxxxxx] 
> Sent: Tuesday, May 20, 2014 1:18 PM
> To: Stewart, Louis (IS)
> Cc: git@xxxxxxxxxxxxxxx
> Subject: EXT :Re: GIT and large files
>
> "Stewart, Louis (IS)" <louis.stewart@xxxxxxx> writes:
>
>> Can GIT handle versioning of large 20+ GB files in a directory?
>
> I think you can "git add" such files, push/fetch histories that
> contain such files over the wire, and "git checkout" such files,
> but naturally reading, processing and writing 20+GB would take
> some time.  In order to run operations that need to see the
> changes, e.g. "git log -p", a real content-level merge, etc., you
> would also need sufficient memory because we do things in-core.
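
A related knob, not mentioned above: blobs larger than
core.bigFileThreshold (512m by default, so 20+GB files already
qualify) are stored deflated but whole, with no attempt at delta
compression, which bounds memory use when adding and repacking them.
A minimal sketch of tuning it, with a made-up value:

    # Lower the threshold so that medium-sized binaries also skip
    # delta compression during "git add" and "git gc":
    git config core.bigFileThreshold 256m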