[PATCH WIP 0/4] Special code path for large blobs

Thread "Problem with large files on different OSes" reminds me this.
This series is in my repository for quite some time. It addresses
adding/checking out large blobs as long as:

 - no conversion needs to be done (the skip check is sketched below)
 - the blobs are loose (in the checkout case)
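
To illustrate the first point: patch 1 makes it possible to decide from
the path's attributes alone whether conversion could apply, so a huge
file never has to be read into memory just to find out that nothing
would change. A minimal sketch of that kind of check (the struct and
names below are made up for illustration, not the actual convert.c
code):

#include <stdbool.h>

/* Hypothetical summary of the per-path attributes that drive conversion. */
struct conv_attrs {
	bool text;		/* CRLF/LF conversion requested? */
	bool ident;		/* $Id$ expansion requested? */
	const char *filter;	/* clean/smudge filter command, or NULL */
};

/*
 * True when no conversion can possibly apply, judged from attributes
 * only; the file content itself is never read or inspected.
 */
static int conversion_is_noop(const struct conv_attrs *ca)
{
	return !ca->text && !ca->ident && !ca->filter;
}

When that check says "no-op", the add and checkout paths are free to
stream the data instead of buffering the whole file.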

Together with a patch that prevents large blobs from being packed
(something like what Dana How sent long ago), and a modification of
the "lazy clone/remote alternatives" patch so that large blobs are not
packed again when they are sent over the network, I think this should
make git usable with large files.

Just something to play with.

Nguyễn Thái Ngọc Duy (4):
  convert.c: refactor in order to skip conversion early without looking
    into file content
  sha1_file.c: add streaming interface for reading blobs
  write_entry: use streaming interface for checkout large files
  index_fd: support indexing large files
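
For patches 2 and 3, the core idea is to inflate a loose blob in
fixed-size chunks and write each chunk straight to the working-tree
file, so checkout memory stays bounded no matter how big the blob is.
Roughly along these lines (a simplified sketch on top of plain zlib,
not the interface the patches actually add; partial writes and some
error details are ignored):

#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <zlib.h>

#define CHUNK 8192

/* Stream one loose blob file to out_fd without ever holding it whole. */
static int stream_loose_blob(FILE *in, int out_fd)
{
	unsigned char inbuf[CHUNK], outbuf[CHUNK];
	z_stream zs;
	int header_done = 0;

	memset(&zs, 0, sizeof(zs));
	if (inflateInit(&zs) != Z_OK)
		return -1;

	for (;;) {
		zs.avail_in = fread(inbuf, 1, sizeof(inbuf), in);
		if (ferror(in) || !zs.avail_in)
			break;	/* unreadable or truncated object */
		zs.next_in = inbuf;

		do {
			unsigned char *p = outbuf;
			unsigned char *nul;
			size_t n;
			int ret;

			zs.avail_out = sizeof(outbuf);
			zs.next_out = outbuf;
			ret = inflate(&zs, Z_NO_FLUSH);
			if (ret != Z_OK && ret != Z_STREAM_END)
				goto fail;
			n = sizeof(outbuf) - zs.avail_out;

			if (!header_done) {
				/* a loose object starts with "<type> <size>\0" */
				nul = memchr(p, '\0', n);
				if (!nul)
					continue;	/* still inside the header */
				n -= nul + 1 - p;
				p = nul + 1;
				header_done = 1;
			}
			if (n && write(out_fd, p, n) != (ssize_t)n)
				goto fail;
			if (ret == Z_STREAM_END) {
				inflateEnd(&zs);
				return 0;
			}
		} while (!zs.avail_out);
	}
fail:
	inflateEnd(&zs);
	return -1;
}

(Link with -lz; the 8KB buffer is arbitrary, anything reasonable works.)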

 cache.h     |   10 +++
 convert.c   |   86 ++++++++++++++++++++---
 entry.c     |   68 +++++++++++++++++
 sha1_file.c |  233 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
 4 files changed, 388 insertions(+), 9 deletions(-)
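
Patch 4 is the add side of the same story: index_fd can compute the
object name of a big file by feeding it to SHA-1 in chunks rather than
mapping the whole thing. A rough sketch of just the hashing part, using
OpenSSL's SHA-1 purely for illustration (git carries its own SHA-1
code, and the real path also has to write the object, not only name
it):

#include <openssl/sha.h>
#include <stdio.h>
#include <unistd.h>

/*
 * Compute the git blob name for an open file descriptor while holding
 * at most one chunk of it in memory.  Illustrative only.
 */
static int hash_blob_fd(int fd, off_t size, unsigned char sha1[20])
{
	char hdr[32];
	unsigned char buf[8192];
	SHA_CTX ctx;
	int hdrlen;

	/* the object name covers "blob <size>\0" followed by the content */
	hdrlen = snprintf(hdr, sizeof(hdr), "blob %llu",
			  (unsigned long long)size) + 1;

	SHA1_Init(&ctx);
	SHA1_Update(&ctx, hdr, hdrlen);

	while (size > 0) {
		size_t want = size < (off_t)sizeof(buf) ? (size_t)size : sizeof(buf);
		ssize_t n = read(fd, buf, want);
		if (n <= 0)
			return -1;	/* short read: stat size was stale? */
		SHA1_Update(&ctx, buf, n);
		size -= n;
	}
	SHA1_Final(sha1, &ctx);
	return 0;
}

(Link with -lcrypto if you want to try it standalone.)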

