On Wed, 28 May 2014, Junio C Hamano wrote:
David Lang <david@xxxxxxx> writes:
On Wed, 28 May 2014, Dale R. Worley wrote:
It seems that much of Git was coded under the assumption that any file
could always be held entirely in RAM. Who made that mistake? Are
people so out of touch with reality?
Git was designed to track source code; there are warts in the implementation that show up when you use individual files >4GB, and such files tend not to diff well. git-annex and other offshoots have methods bolted on that handle such large files better than core git does.
Very well explained, but perhaps you went a bit too far, I am
afraid.
The fact that our primary focus is source code does not at all mean that we are not interested in enhancing the system to also cater to those who want to put materials traditionally considered non-source into it, now that we have become fairly good at handling source code.
Correct, I didn't mean to imply that git is only for source files, just noting
its original purpose.
Now that there are multiple add-ons for git that handle large files in
different ways, I'm watching to see what can get folded back into the core.
David Lang
--
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at http://vger.kernel.org/majordomo-info.html