Re: Crawling/downloading a website to test permissions.

Hello

There is at least wget with its recursive features. Alternatively, if you have access to the web server logs, you can look there for any access errors.
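A minimal sketch of how that could look, assuming the site lives at
https://example.com (a placeholder) and that you want both a local
mirror and a log you can grep for errors afterwards:

  # Mirror the site recursively, keeping the directory structure,
  # and write everything wget reports into crawl.log.
  wget --recursive --level=inf --no-parent --page-requisites \
       --output-file=crawl.log https://example.com/

  # Forbidden or missing pages show up as 403/404 errors in the log.
  grep -E 'ERROR (403|404)' crawl.log

If you only want the report rather than the downloaded files, adding
--spider makes wget walk the links without saving anything.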



--
mr. M01510 & guide Loadstone-GPS  Lat: 62.38718, lon: 25.64672
hkp://wwwkeys.pgp.net B784D020 fp:0C1F6A76 DC9DDD58 33838B5D 0E769600 B7840D02
http://sokkona.arimo.info


 Linux for blind general discussion wrote:
Subject: Crawling/downloading a website to test permissions.
Date: Sun, 3 Oct 2021 08:47:26
From: Linux for blind general discussion <blinux-list@xxxxxxxxxx>
To: Linux for blind general discussion <blinux-list@xxxxxxxxxx>

Okay, so a few minutes ago, I realized at least one folder on my
website that's supposed to be readable by visitors isn't... and that
got me wondering.

Is there a command I can run from the Linux terminal with my domain as
an argument that will start at the homepage, follow all the links
and embedded images, and either generate a report of the content
that's accessible or download everything preserving full paths, which I
can then compare to an offline copy of the site or an ls -R thereof to
ensure everything that's supposed to be reachable through normal
browsing is, without having to manually follow every link?
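
A minimal sketch of that comparison step, assuming the wget mirror
ended up in ./example.com and the original files live in ./public_html
(both paths placeholders):

  # List every file in each tree and diff the listings; anything that
  # appears only in ./public_html was not reachable by crawling.
  diff <(cd example.com && find . -type f | sort) \
       <(cd public_html && find . -type f | sort)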

_______________________________________________
Blinux-list mailing list
Blinux-list@xxxxxxxxxx
https://listman.redhat.com/mailman/listinfo/blinux-list





