On Tue, Mar 11, 2008 at 3:53 PM, Andreas Hildebrandt <anhi@xxxxxxxxxxxxxxxx> wrote:
>
> I absolutely agree that it's strange. The main reason for this is that
> we have some collections of data files (some of them pretty large) that
> can be compressed pretty effectively. At compilation time, it is decided
> whether the files are needed or not. If so, they are extracted. In the
> end, the .tar.gz files are deleted since they are no longer needed. In
> addition, once a user has obtained a checkout, the whole thing is
> supposed to work without a further net connection, so downloading the
> files during the build is not really an option.

Maybe you can have a look at pristine-tar. From
http://kitenet.net/~joey/code/pristine-tar/:

    pristine-tar can regenerate a pristine upstream tarball using only a
    small binary delta file and a copy of the source which can be a
    revision control checkout. The package also includes a pristine-gz
    command, which can regenerate a pristine .gz file.

    The delta file is designed to be checked into revision control
    along-side the source code, thus allowing the original tarball to be
    extracted from revision control.

So you can recreate the tar from files in a git branch.
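For instance, something along these lines should work (an untested
sketch; the tarball name and the branch holding the unpacked files are
just placeholders for whatever your tree uses):

    # In a clone where some branch contains the unpacked data files,
    # store a small binary delta on the special "pristine-tar" branch:
    $ pristine-tar commit datafiles.tar.gz <branch-with-unpacked-files>

    # The original .tar.gz can now be deleted from the working tree.
    # Later, in any clone of the repository, regenerate a bit-identical
    # copy of the tarball from the checkout plus the stored delta:
    $ pristine-tar checkout datafiles.tar.gz

Since the delta lives in the repository itself, regenerating the
tarball at build time needs no network connection.

Santi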