Have you tried including some cache control directives in your web pages?
<META HTTP-EQUIV="Cache-Control" CONTENT="no-cache">
<META HTTP-EQUIV="Pragma" CONTENT="no-cache">
Solid point, Sean... but in trying to be a nice netizen, I realize that my pages /are/ static and thus can safely be cached for significant periods of time. I rarely update stuff more than once (maybe twice) a week, so setting no-cache would be pretty unfriendly on my part, I think.
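Since the pages are static, a friendlier middle ground than no-cache is to advertise an explicit lifetime, so caches may keep the pages but must refresh them within a few days. A minimal sketch, assuming Apache with mod_expires loaded (the three-day window is only an illustration):

    # httpd.conf fragment -- assumes mod_expires is available
    ExpiresActive On
    ExpiresDefault "access plus 3 days"

That tells any cache, transparent or otherwise, exactly how stale it may let the pages get.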
The full spec of the cache control directives can be found here:
http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.9
Hey, thanks. I'll go learn something...
I don't know, but I wonder whether specifying a public anonymous proxy in your browser would bypass your ISP's proxy. Then again, maybe the same issues exist even if you're directing requests to an external proxy of your choice.
Transparent proxies will intercept ANYTHING on port 80. Only if you run that outside proxy on a non-standard high port would this work.
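One rough way to check whether something in the path is intercepting port 80 is to look for proxy fingerprints in the response headers; transparent caches often add Via or X-Cache headers. A sketch using curl (example.com is just a placeholder):

    curl -sI http://example.com/ | grep -i -E 'via|x-cache'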
Two of the suggestions so far should prove particularly effective, since they work around the problem entirely:
1. Use lynx or links on an outside machine on another ISP's network, or tunnel HTTP through SSH and keep using the browsers at home through that box (see the sketch after this list).
2. Set up a self-signed certificate and check the site over HTTPS, since nobody in their right mind would cache secure pages (a sketch of that follows below as well).
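For option 1, a minimal sketch of the tunnel, assuming a shell account on a machine outside the ISP (outside.example.net is a placeholder) running a proxy such as squid on its default port 3128:

    # forward local port 8080 through SSH to the remote proxy
    ssh -L 8080:localhost:3128 user@outside.example.net

Then point the browser's proxy setting at localhost:8080; the HTTP traffic rides inside SSH on port 22, so the transparent proxy never sees it.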
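For option 2, a self-signed certificate can be generated with OpenSSL (the filenames and one-year lifetime are just placeholders):

    # create an unencrypted key and a self-signed cert in one step
    openssl req -x509 -newkey rsa:2048 -nodes -keyout server.key -out server.crt -days 365

Point the web server's SSLCertificateFile and SSLCertificateKeyFile (mod_ssl, assuming Apache) at the two files. The browser will warn about the untrusted signer, but the pages travel over HTTPS, which no sane cache will store.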
Those two ought to take care of me quite nicely, so I've learned some good lessons here today. Thanks to everyone.
-- Rodolfo J. Paiz rpaiz@xxxxxxxxxxxxxx http://www.simpaticus.com