Re: [PATCH] git-p4: improve performance with large files

I don't understand the point of trying to save the 32 MiB if people
are sending you blobs that large.

The approach to avoid sequences of appends looks sound.
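To illustrate (a sketch, not the actual git-p4 code): repeatedly appending to a string copies the growing buffer on each append, which is quadratic overall, while collecting chunks in a list and joining once at the end is linear.

```python
# Illustrative only -- the function names and chunk data are made up,
# not taken from the git-p4 patch.
def concat_naive(chunks):
    text = ""
    for c in chunks:
        text += c          # each += may copy everything accumulated so far
    return text

def concat_join(chunks):
    data = []
    for c in chunks:
        data.append(c)     # O(1) amortized per chunk
    return "".join(data)   # one final copy

chunks = ["ab"] * 1000
assert concat_naive(chunks) == concat_join(chunks)
```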


On Thu, Mar 5, 2009 at 10:14 PM, Junio C Hamano <gitster@xxxxxxxxx> wrote:

>> ... The ideal solution is to use a generator and refactor the commit
>> handling as a stream. I am working on that but it involves deeper
>> changes, so as I am not sure it will be accepted, I'm providing the
>> attached compromise patch first. At least it solves the appaling speed
>> issue. I tuned it so that it never uses more than 32 MiB extra memory.

>> +            text += ''.join(data)
>> +            del data

I'd say

  data = []

and add a comment that you're trying to save memory. There is no reason
to remove data from the namespace.
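In other words (my sketch, assuming the quoted fragment above is the whole pattern): rebinding the name to an empty list drops the last reference to the chunk list just as `del data` does, so the memory can be reclaimed, but the name stays defined for any later code.

```python
# Hypothetical fragment modeled on the quoted patch, not the patch itself.
data = ["x" * 1024] * 4      # stand-in for accumulated file chunks
text = "".join(data)
data = []                    # release the chunk list to save memory;
                             # unlike `del data`, the name remains usable
assert len(text) == 4096
assert data == []
```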

-- 
Han-Wen Nienhuys
Google Engineering Belo Horizonte
hanwen@xxxxxxxxxx
--
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html
