Erik Labree wrote:
> Davide Bianchi <davide <at> onlyforfun.net> writes:
>
>> Erik Labree wrote:
>>> To generate those local files we need a tool/script that looks at
>>> the directory structure of the web application, requests the pages
>>> it can see, and saves each page as plain text to the file system in
>>> the same directory structure as the webapp. Are there already tools
>>> that do this? Otherwise, any input on how to do this would be
>>> greatly appreciated.
>>
>> Every browser can do it. Otherwise there are 'spiders' that can
>> crawl and download an entire site.
>>
>> Davide
>
> Thanks Davide, 'spiders' seem like a good way to go because it needs
> to be automated. Does anyone have any experience with them?
>
> Erik
wget is a GNU utility available with most Linux distributions that can act as a spider to download and store an entire web site.
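For example, something along these lines should mirror a site into a
matching directory tree on disk (a sketch only; http://www.example.com/
is a placeholder, and you may want to tune the recursion depth for your
site):

    # Crawl the site recursively, up to 5 levels deep, without
    # following links above the starting directory; pages are saved
    # in a directory tree mirroring the site's URL structure.
    wget --recursive --level=5 --no-parent http://www.example.com/

Note that wget saves each page as it is served (usually HTML) rather
than converted to plain text, so a separate conversion step may still
be needed for your use case.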
See: http://www.gnu.org/software/wget/wget.html

John