On 06/11/11 16:42, Jakub Narebski wrote:
> Jonathan Fine <jfine@xxxxxxxxx> writes:
>
>> Hi
>>
>> This is to let you know that I'm writing (in Python) a script that
>> places the content of CTAN into a git repository.
>> https://bitbucket.org/jfine/python-ctantools
>
> I hope that you meant "repositories" (plural) here, one per tool,
> rather than putting all of CTAN into a single Git repository.
There are complex dependencies among LaTeX macro packages, and TeX is
often distributed and installed from a DVD. So it makes sense here to
put *all* the content of a DVD into a repository.
Once you've done that, it is then possible and sensible to select
suitable interesting subsets, such as releases of a particular package.
Users could even define their own subsets, such as "all resources needed
to process this file, exactly as it processes on my machine".
In addition, many TeX users have a TeX DVD. If they import it into a
git repository (using for example my script) then the update from 2011
to 2012 would require much less bandwidth.
Finally, I'd rather be working within git than on a modified copy of the
ISO when doing the subsetting. I'm pretty sure that I can manage to pull
the small repositories from the big git-CTAN repository.
But as I proceed, perhaps I'll change my mind (smile).
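To make the subsetting idea concrete, here is the sort of thing I have
in mind. It is only a sketch: the repository paths and the package
directory are invented, and filter-branch is just one possible way of
doing the split.

    # Hypothetical sketch: split one package's history out of the big
    # git-CTAN repository into its own small repository, using
    # git filter-branch's --subdirectory-filter.  Paths are illustrative.
    import subprocess

    def split_package(big_repo, subdir, new_repo):
        """Clone big_repo, then rewrite history so only subdir remains."""
        subprocess.run(['git', 'clone', '--no-hardlinks', big_repo, new_repo],
                       check=True)
        subprocess.run(['git', 'filter-branch', '--prune-empty',
                        '--subdirectory-filter', subdir, 'HEAD'],
                       cwd=new_repo, check=True)

    # For example:
    #   split_package('/srv/git-ctan', 'macros/latex/contrib/hyperref',
    #                 '/srv/hyperref')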
>> I'm working from the TeX Collection DVDs that are published each year
>> by the TeX user groups, which contain a snapshot of CTAN (about
>> 100,000 files occupying 4 GB). This means I have to unzip folders and
>> do a few other things.
>
> There is 'contrib/fast-import/import-zips.py' in the git.git repository.
> If you are not using it, or its equivalent, it might be worth checking
> out.
Well, I didn't know about that. I took a look, and it doesn't do what I
want. I need to walk the tree (on a mounted ISO) and unpack some (but
not all) zip files as I come across them. For details see:
https://bitbucket.org/jfine/python-ctantools/src/tip/ctantools/filetools.py
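To give the flavour of it, here is a minimal sketch of the walk. It is
not the code in filetools.py, and the should_unpack predicate is
invented, but it shows the shape of the problem: walk a mounted ISO,
expand some zip files on the fly, and hand everything else over
unchanged.

    import os
    import zipfile

    def walk_iso(root, should_unpack):
        """Yield (path, bytes) pairs for every file under a mounted ISO,
        expanding the zip archives for which should_unpack(path) is true."""
        for dirpath, dirnames, filenames in os.walk(root):
            for name in sorted(filenames):
                path = os.path.join(dirpath, name)
                if name.endswith('.zip') and should_unpack(path):
                    with zipfile.ZipFile(path) as archive:
                        for info in archive.infolist():
                            if info.filename.endswith('/'):
                                continue  # skip directory entries
                            yield (os.path.join(path, info.filename),
                                   archive.read(info))
                else:
                    with open(path, 'rb') as f:
                        yield path, f.read()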
In addition, I don't want to make a commit. I just want to make a ref
at the end of building the tree. This is because I want the import of a
TeX DVD to give effectively identical results for all users, and so any
commit information would be effectively constant.
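Concretely, I expect something along these lines will do: write the
blobs, build the tree with git mktree, and point a lightweight tag
straight at the tree. This is only a sketch with invented names (the
ref name, the flat layout), not what the script does yet.

    # Rough sketch: store blobs, build a tree with git mktree, and point
    # a lightweight tag at the bare tree.  No commit object is created.
    # This flat example ignores subdirectories; mktree wants one level
    # at a time.
    import subprocess

    def git(*args, data=None):
        """Run a git plumbing command and return stdout as stripped text."""
        result = subprocess.run(['git', *args], input=data,
                                stdout=subprocess.PIPE, check=True)
        return result.stdout.decode().strip()

    def ref_bare_tree(files, refname='refs/tags/tex-dvd-2011'):
        """files maps a file name (no '/') to its contents as bytes."""
        entries = []
        for name, data in sorted(files.items()):
            sha = git('hash-object', '-w', '--stdin', data=data)
            entries.append('100644 blob %s\t%s' % (sha, name))
        listing = ''.join(entry + '\n' for entry in entries).encode()
        tree = git('mktree', data=listing)
        git('update-ref', refname, tree)  # the tag points straight at the tree
        return tree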
>> CTAN is the Comprehensive TeX Archive Network. CTAN keeps only the
>> latest version of each file, but old CTAN snapshots will provide many
>> earlier versions.
>
> There was a similar effort to put CPAN (the Comprehensive _Perl_
> Archive Network) into Git, hosting the repositories on GitHub[1], under
> the name gitPAN; see e.g.:
>
> "The gitPAN Import is Complete"
> http://perlisalive.com/articles/36
>
> [1]: https://github.com/gitpan
This is really good to know!!! Not only has this been done already, for
similar reasons, but GitHub is hosting it. Life is easier when there is
a good example to follow.
>> I'm working on putting old CTAN files into modern version
>> control. Martin Scharrer is working in the other direction. He's
>> putting new files added to CTAN into Mercurial.
>>
>> http://ctanhg.scharrer-online.de/
>
> Nb. thanks to tools such as git-hg and fast-import / fast-export
> we have quite good interoperability and convertibility between
> Git and Mercurial.
>
> P.S. I'd point to the reposurgeon tool, which can be used to do fixups
> after import, but it probably won't work on such a large (set of)
> repositories.
Thank you for the pointer to reposurgeon. My approach is a bit
different. First, get all the files into git, and then 'edit the tree'
to create new trees. And then commit worthwhile new trees.
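By 'edit the tree' I mean working with git's index and tree objects
rather than a working copy. A rough sketch of one such edit follows;
the ref names and paths are invented, and it assumes a non-bare
repository.

    # Hypothetical sketch: load one subdirectory of an existing tree into
    # a throwaway index, write it out as a new tree, and leave the
    # decision about committing it for later.
    import os
    import subprocess

    def subset_tree(repo, source_tree, keep_prefix):
        """Return the SHA-1 of a new tree holding only keep_prefix."""
        env = dict(os.environ,
                   GIT_INDEX_FILE=os.path.join(repo, '.git', 'subset-index'))
        def git(*args):
            return subprocess.check_output(['git', *args],
                                           cwd=repo, env=env).decode().strip()
        git('read-tree', '%s:%s' % (source_tree, keep_prefix))  # just that subtree
        return git('write-tree')

    # Only if the resulting tree looks worthwhile would it then get a
    # commit, for example via:  git commit-tree <new-tree> -m '...'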
As I recall, the first 'commit' to the git repository for the Linux
kernel was just a tree, with a reference to that tree as a tag, but no
commit object.
> P.P.S. Can you forward it to comp.text.tex?
Done.
--
Jonathan