On Tue, Mar 11, 2008 at 2:53 PM, Andreas Hildebrandt <anhi@xxxxxxxxxxxxxxxx> wrote:
> I absolutely agree that it's strange. The main reason for this is that
> we have some collections of data files (some of them pretty large) that
> can be compressed pretty effectively. At compilation time, it is decided
> if the files are needed or not. If so, they are extracted. In the end,
> the .tar.gz files are deleted since they are no longer needed. In

This is probably a silly question: why are you actually deleting the
tar.gz files? If they compress the data files well, then any storage
requirement for keeping them is dwarfed by the size of the unpacked data
files (presumably). If a minor tweak to your build process avoids more
complicated git scripting, that sounds like a reasonable trade-off.

--
cheers, dave tweed__________________________
david.tweed@xxxxxxxxx
Rm 124, School of Systems Engineering, University of Reading.
"while having code so boring anyone can maintain it, use Python." --
attempted insult seen on slashdot
--
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at http://vger.kernel.org/majordomo-info.html