many files, simple history

short version: will git handle a large number of files efficiently if
the history is simple and linear, i.e., without merges?

longer version: i'm considering using git to keep track of my
installed user programs/files on my linux machine, mainly because i
want to be able to uninstall software cleanly and completely (i almost
always build from source code and install in non-standard locations).
so i would want to use git to keep track of every program i install or
uninstall. that way, i could go back and uninstall a program simply by
finding the commit where it was installed, getting the list of files
that were added as a result, and removing them (and of course,
committing the removals into git so i can always undo the uninstall
too!)

so the history stored in git will be linear and consist of file
additions and removals. but this will potentially involve many files.
will git be able to handle this (as yet hypothetical) situation
efficiently?
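for concreteness, here is a rough sketch of the workflow i have in
mind, assuming the install prefix is a git repo and each program lands
in a single commit (the "foo" program, its paths, and the commit
messages are made up for illustration):

```shell
set -e
prefix=$(mktemp -d)          # stand-in for the real install prefix
cd "$prefix"
git init -q
git -c user.name=me -c user.email=me@example.com \
    commit -q --allow-empty -m 'baseline'

# simulate installing a program: new files, one commit
mkdir -p bin share/man/man1
printf 'binary\n'  > bin/foo
printf 'manpage\n' > share/man/man1/foo.1
git add -A
git -c user.name=me -c user.email=me@example.com \
    commit -q -m 'install foo'

# later: find the commit that installed foo ...
commit=$(git log --format=%H --grep='install foo')

# ... list the files that commit added (--diff-filter=A = additions only) ...
files=$(git diff-tree -r --no-commit-id --name-only \
        --diff-filter=A "$commit")

# ... remove them, and commit the removal so the uninstall is undoable
echo "$files" | xargs rm -f
git add -A
git -c user.name=me -c user.email=me@example.com \
    commit -q -m 'uninstall foo'

test ! -e bin/foo && echo 'foo uninstalled'
```

undoing the uninstall would then just be a `git revert` of that last
commit.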

/ali
--
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html
