Re: Downloading an Entire Website

Luke Davis <ldavis@shellworld.net> writes:

> If any part of the site uses CGIs for anything, or uses any dynamically
> generated content, this method will not work correctly.

Actually, wget saves the pages those scripts generate; in effect, you
get a snapshot of the website at that particular instant.
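For the archives: assuming GNU wget and a made-up site URL, a
recursive mirror would look something like

    wget --mirror --convert-links --page-requisites --no-parent \
         http://www.example.com/

--mirror turns on recursion and timestamping, --convert-links rewrites
the links so the copy browses locally, --page-requisites grabs the
images and stylesheets each page needs, and --no-parent keeps it from
wandering above the starting directory.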

> You should get access to the back end of the site (the server) directly,
> either via FTP, Telnet, SSH, or the like, and tar.gz the site directly.

Now that's going to be a bit tough to do, and you still won't have
access to the database unless the web team is really, really nice. =)
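If you do get shell access, though, something along these lines (the
hostname and document root are just guesses) would pull a tarball of
the site down over ssh:

    ssh user@www.example.com 'tar czf - /var/www/html' > site.tar.gz

That still leaves behind anything that lives in a database, of course.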

Seriously, though, it may be worth seeing if they have a more
accessible format - some sites have RDF newsfeeds...
-- 
Sacha Chua <sacha@free.net.ph> - 4 BS CS Ateneo geekette
interests: emacs, gnu/linux, wearables, teaching compsci



