In addition to what Amos already answered:
Yes, a URL rewriter like squidGuard can block HTTPS sites.
But for HTTPS the URL that the URL rewriter receives is only the
domain name (squid hands it the CONNECT request, not a full URL),
which makes managing lists of URLs a little more complicated.
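For example, this is how squidGuard is typically hooked into
squid.conf (the paths are illustrative):

    # squid.conf: run squidGuard as a URL rewriter.
    # For HTTPS, squid passes the CONNECT request, so the rewriter
    # only sees "host:port" (e.g. www.example.com:443), never the
    # full URL with path and query string.
    url_rewrite_program /usr/local/bin/squidGuard -c /etc/squid/squidGuard.conf
    url_rewrite_children 8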
ufdbGuard is a fork of squidGuard that I have actively maintained
since 2005.
ufdbGuard has additional features for safer HTTPS traffic:
it probes HTTPS sites and has configuration options to block
HTTPS connections that have incorrect SSL certificates, carry
Skype chat, or carry SSH tunnels, and it also blocks UltraSurf.
A configuration sketch follows.
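A minimal sketch of the relevant global options; I am quoting the
directive names from memory of the ufdbGuard reference manual, so
verify them against the manual for your version:

    # ufdbGuard.conf (sketch): global options for HTTPS checks.
    # Require a hostname (not a bare IP address) in CONNECT requests.
    option enforce-https-with-hostname on
    # Block HTTPS sites that present an incorrect (e.g. self-signed)
    # SSL certificate; Skype servers fail this check.
    option enforce-https-official-certificate on
    # Refuse servers that only speak the insecure SSLv2 protocol.
    option https-prohibit-insecure-sslv2 on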
ufdbGuard is a multithreaded daemon, much faster than squidGuard,
and it can enforce SafeSearch for many search engines.
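As I recall it, SafeSearch enforcement is a single global switch
(again, check the directive name in the reference manual):

    # ufdbGuard.conf (sketch): force SafeSearch on supported engines.
    option safe-search on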
ufdbGuard works with free URL databases as well as a commercial one.
Marcus
Brent Norris wrote:
List,
I currently have squid set up as an interception proxy in my school
district. I also have it configured on our static network machines.
I understand that squid will not work as an interception proxy for
anything that isn't standard HTTP, according to documentation available
on the web.
What I was wondering, though, is whether there is a way that I could
set my Linux server up to accept other kinds of traffic (HTTPS,
streaming media) and pass that traffic on without really proxying it,
but still compare it against my squidGuard lists?
I do a lot of filtering of objectionable sites for our students in
squidGuard, and it would be a very big hole to let all those sites
through if the students are using HTTPS to get to them.
I am not really set on any specific approach. If someone has a better
idea about how I should go about it, please feel free to give me any
pointers that you might have.
Thanks for any info that you can provide.
Brent