Crawling/downloading a website to test permissions.

Okay, so a few minutes ago I realized that at least one folder on my
website that's supposed to be readable by visitors isn't... and that
got me wondering.

Is there a command I can run from the Linux terminal, with my domain as
an argument, that will start at the homepage, follow all the links and
embedded images, and either generate a report of the content that's
accessible or download everything while preserving full paths? I could
then compare the result against an offline copy of the site (or an
ls -R thereof) to make sure everything that's supposed to be reachable
through normal browsing actually is, without having to manually follow
every link.
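For what it's worth, one common way to do this is with wget, which can crawl recursively starting from a URL. A sketch (example.com is a placeholder for your actual domain; adjust flags to taste):

```shell
# "Spider" mode: recursively follows links and page requisites (images,
# CSS, etc.) without saving anything, logging every URL it checks.
# Broken links and 404s show up in the log, which you can then grep.
wget --spider --recursive --no-parent --page-requisites \
     --level=inf --output-file=crawl.log https://example.com/

# Or actually download a mirror, preserving the directory structure,
# so you can diff it against your offline copy (or an ls -R of it):
wget --mirror --page-requisites --no-parent \
     --directory-prefix=site-copy https://example.com/
```

Afterward you could run something like `grep -i 'broken link\|404' crawl.log` on the spider log, or `diff -r site-copy/example.com /path/to/offline/copy` on the mirror, to spot anything unreachable. Note wget only discovers what is actually linked from the pages it parses, so orphaned-but-readable files won't appear either way.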

_______________________________________________
Blinux-list mailing list
Blinux-list@xxxxxxxxxx
https://listman.redhat.com/mailman/listinfo/blinux-list
