> I am about to do some maintenance on a web site I did not write that appears to be hundreds of files of entirely static html. One of the things I would like to do is parse the index.html file and follow all the links to find all the broken links
I use checkbot to do this; I run it as a weekly cron job so I know when sites that I link to go bad. There are other similar programs. freshmeat.net might be a good place to start looking, depending on your platform preferences.
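For example, the weekly cron entry might look like the line below. (example.com is a placeholder, and I'm quoting the checkbot flags from memory, so check checkbot's man page for your version.)

    # run every Monday at 04:00 and mail the report to the webmaster
    0 4 * * 1  checkbot --url http://www.example.com/ --mailto webmaster@example.com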
> ... all the unused files, and get a graph of the site structure.
Not sure about those things. (I wrote a Perl script years ago to generate a site map by following links; you can see the result at http://chezphil.org/map/. It isn't quite what you want, though, as it relies on annotating the source to hint at the structure.)
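That said, since the site is entirely static, the broken-link set and the unused-file set fall out of the same crawl over the local files. Here is a minimal Python sketch of the idea, not a finished tool: it assumes the whole site lives in one directory (SITE_ROOT is a hypothetical path), it only follows relative and root-relative links, and it deliberately ignores external URLs and directory-style links like "docs/".

    import os
    import posixpath
    from html.parser import HTMLParser
    from urllib.parse import urlparse, unquote

    SITE_ROOT = "/var/www/site"   # hypothetical: root directory of the static site

    class LinkExtractor(HTMLParser):
        """Collect href/src attribute values from a page."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            for name, value in attrs:
                if name in ("href", "src") and value:
                    self.links.append(value)

    def local_target(referrer, link):
        """Resolve a link relative to the referring page. Returns a
        site-relative path for internal links, or None for external
        URLs, mailto: links, and bare #fragment links."""
        parsed = urlparse(link)
        if parsed.scheme or parsed.netloc or not parsed.path:
            return None
        path = unquote(parsed.path)
        if path.startswith("/"):              # root-relative link
            return posixpath.normpath(path).lstrip("/")
        base = posixpath.dirname(referrer)
        return posixpath.normpath(posixpath.join(base, path))

    def crawl(root, start="index.html"):
        reachable, broken, queue = set(), set(), [start]
        while queue:
            rel = queue.pop()
            if rel in reachable or rel in broken:
                continue
            full = os.path.join(root, rel)
            if not os.path.isfile(full):      # missing target = broken link
                broken.add(rel)
                continue
            reachable.add(rel)
            if rel.endswith((".html", ".htm")):   # only parse HTML files
                parser = LinkExtractor()
                with open(full, errors="replace") as f:
                    parser.feed(f.read())
                for link in parser.links:
                    target = local_target(rel, link)
                    if target is not None:
                        queue.append(target)
        return reachable, broken

    reachable, broken = crawl(SITE_ROOT)
    on_disk = {os.path.relpath(os.path.join(dirpath, name), SITE_ROOT)
               for dirpath, _, names in os.walk(SITE_ROOT) for name in names}
    print("Broken internal links:", sorted(broken))
    print("Unused files:", sorted(on_disk - reachable))

If you record (referrer, target) pairs inside the crawl instead of just targets, those edges are also your graph of the site structure, ready to feed into something like Graphviz.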
> I am interested in a browser compatibility assessment.

Not sure about that either, but it's an interesting question as to whether you can automate browser compatibility testing to any significant extent.
--Phil.