On Sat July 8 2006 13:00, Juhana Sadeharju wrote:
> 20 MB vs 13.5 GB makes me think the wiki is not of the best
> technology. The download software could be better as well. I
> will mail my suggestions to wget list.

TWiki has a "publish" plugin (I think I've mentioned it before) that
dumps a static, rendered version of all its pages with links rewritten
in a way that makes sense, and optionally zips the result (or maybe I
added the zipping part to my local copy; I can't remember). Unfortunately,
TWiki has also been Swiss cheese, security-wise, in my experience, but
given MediaWiki's prominence I would think someone must have written
something like a "publish" plugin for it by now to do the same thing.

I don't think the problem with mirroring wikis can really be fixed with
a patch to wget. It's the nature of dynamically generated content that a
tool designed to retrieve static content will retrieve far more data than
it needs to. The same would apply if you were trying to mirror a web
forum or a more traditional CMS.

Rob
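(For what it's worth, you can at least cut down the over-fetching from
the wget side by rejecting the dynamically generated edit/history/special
URLs. A minimal sketch, assuming a MediaWiki-style URL layout and a wget
recent enough to have --reject-regex; wiki.example.org is a placeholder,
and the pattern would need tuning for a real site:)

```shell
# Hypothetical filter for MediaWiki "action" pages that bloat a mirror.
REJECT='action=|oldid=|diff=|Special:|index\.php\?'

# The actual mirror run would look something like this (needs network,
# so it is commented out here):
# wget --mirror --no-parent --convert-links \
#      --reject-regex "$REJECT" \
#      https://wiki.example.org/wiki/Main_Page

# Sanity-check the filter against sample URLs: a rendered article page
# passes through, an edit-action URL is rejected.
echo 'https://wiki.example.org/wiki/Main_Page' | grep -Ev "$REJECT"
echo 'https://wiki.example.org/index.php?title=Main_Page&action=edit' \
  | grep -Ec "$REJECT"
```

(Even so, wget still has to fetch a page before it can decide anything
about the links on it, so this only trims the worst of it; it doesn't
change the underlying mismatch between a static mirroring tool and
dynamic content.)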