Re: Git import of the recent full enwiki dump

On Sat, Apr 17, 2010 at 02:19, Sverre Rabbelier <srabbelier@xxxxxxxxx> wrote:

> Assuming you do the import incrementally
> using something like git-fast-import (feeding it with a custom
> exporter that uses the dump as its input) you shouldn't even need an
> extraordinary machine to do it (although you'd need a lot of storage).

I am using a Python script [1] to import the XML dump.
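
In case it helps anyone follow along, here is a heavily stripped-down
sketch of that approach (one commit per <revision>, parsed with
ElementTree; the real script [1] handles authors, timestamps and
ordering properly, so treat the names and layout below as illustrative
only):

  #!/usr/bin/env python
  # Sketch only: emit a git fast-import stream from a MediaWiki XML dump.
  # Usage (inside an empty repository):
  #   python export_sketch.py pages-articles.xml | git fast-import
  import sys
  import xml.etree.ElementTree as ET

  def w(out, s):
      # fast-import wants byte-accurate "data <length>" blocks
      out.write(s.encode('utf-8') if isinstance(s, str) else s)

  def local(tag):
      # Strip the XML namespace, which differs between dump versions.
      return tag.rsplit('}', 1)[-1]

  def commit(out, mark, title, text, timestamp):
      path = title.replace(' ', '_') + '.mediawiki'   # crude escaping
      blob = text.encode('utf-8')
      msg = ('import %s @ %s' % (title, timestamp)).encode('utf-8')
      w(out, 'commit refs/heads/master\n')
      w(out, 'mark :%d\n' % mark)
      w(out, 'committer Importer <importer@example.org> 0 +0000\n')
      w(out, 'data %d\n' % len(msg))
      w(out, msg)
      w(out, '\n')
      w(out, 'M 100644 inline %s\n' % path)
      w(out, 'data %d\n' % len(blob))
      w(out, blob)
      w(out, '\n')

  def main(dump):
      out = sys.stdout.buffer
      mark = 0
      title = text = timestamp = ''
      for _event, elem in ET.iterparse(dump):
          tag = local(elem.tag)
          if tag == 'title':
              title = elem.text or ''
          elif tag == 'timestamp':
              timestamp = elem.text or ''
          elif tag == 'text':
              text = elem.text or ''
          elif tag == 'revision':
              mark += 1
              commit(out, mark, title, text, timestamp)
          elif tag == 'page':
              elem.clear()   # the dump is far larger than RAM

  if __name__ == '__main__':
      main(sys.argv[1])

Since fast-import writes packfiles directly rather than touching the
worktree, this is feasible on ordinary hardware (disk space aside).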


> Speaking of which, it might make sense to separate the
> worktree by prefix, so articles starting with "aa" go under the "aa"
> directory, etc?

Very good idea. What command would I need to send to
git-fast-import to do that?
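
From a quick look at the git-fast-import documentation it seems there is
no dedicated command for this: the directory layout is simply whatever
path the exporter puts on each "M ... inline <path>" line, so the
prefixing would happen on the exporter side, roughly like this (the
helper name and fallback directory are just made up for illustration):

  def repo_path(title):
      # Hypothetical helper: "Aardvark" would end up as
      # "aa/Aardvark.mediawiki"; titles shorter than two characters
      # fall back to a "_" directory.
      name = title.replace(' ', '_')
      prefix = name[:2].lower() if len(name) >= 2 else '_'
      return '%s/%s.mediawiki' % (prefix, name)

  # The exporter would then emit e.g.
  #   M 100644 inline aa/Aardvark.mediawiki
  # and git-fast-import creates the directories implicitly.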


> Hope that helps, and if you do convert it (and it turns out to be
> usable, and you decide to keep it up to date somehow), put it up
> somewhere! :)

It did, thanks.
I will make it available if it turns out to be useful. Keeping it up to
date might be harder unless they keep releasing new
(incremental) snapshots.


Thanks,
Richard


[1] http://github.com/scy/levitation/blob/master/import.py
