Re: Adding a new file as if it had existed

On Tuesday 2006 December 12 11:32, Bahadir Balban wrote:

> If I don't know which files I may be touching in the future for
> implementing some feature, then I am obliged to add all the files even
> if they are irrelevant. I said "performance reasons" assuming all the
> file hashes need to be checked for every commit -a to see if they're
> changed, but I just tried on a PIII and it seems not so slow.

Here's a handy rule of thumb I've learned in my use of git:

 "git is fast.  Really fast."

That'll stand you in good stead.  In my experience there is no operation in git 
that is slow.  I've got some trees that are for embedded work and hold the 
whole linux kernel, often more than once.  Subversion, which I used 
previously, took literally hours to import the whole tree.  Git takes 
minutes.

As to your direct concern: git doesn't hash every file at every commit.  There 
is no need.  git keeps an "index" that is used to prepare a commit; by the time 
you do the actual commit, git already knows which files are being checked in.  
Linus, after all, uses git to manage the linux kernel, and he's said before 
that he wanted a version control system that could do multiple commits /per 
second/.  git can do that.
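
If you want to see that for yourself, the plumbing commands give a quick 
sketch of what's going on (the file name below is just a stand-in for any 
tracked file):

  $ git ls-files --stage          # what the index records for each file
  $ touch Makefile                # bump the timestamp, keep the content
  $ git update-index --refresh    # re-check only files whose stat data changed
  $ git diff-files                # working tree vs. index; nothing to report

The index caches stat data (size, mtime and so on) alongside each entry, so 
git only re-hashes a file when that cached data no longer matches.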

In short - don't worry about making life easy for git - it's a workhorse and 
does a grand job.


Andy
-- 
Dr Andy Parkins, M Eng (hons), MIEE
andyparkins@xxxxxxxxx
