Re: Squid and MailScanner outside DMZ using Kaspersky

Paul Welsh wrote:
I work for a company of 100+ people in the UK.  We use MS ISA 2004 running
SurfControl.  We use www.MessageLabs.com for email scanning and web content
scanning.  The web scanning works by pointing our ISA server to an upstream
proxy at MessageLabs.  This works well and has minimal administrative
overhead but it is rather expensive at about GBP5,000 per year.

There are various web scanning applications out there that sit on the ISA
server such as the one from Kaspersky labs -
http://www.kaspersky.com/anti-virus_ms_isa_server.  This will work out
significantly cheaper than using the MessageLabs web scanner.  However, I
worry about the performance and reliability of installing both this and
SurfControl on my ISA server.

Today I came across Kaspersky's Anti-Virus for Proxy Server which requires
Squid - http://www.kaspersky.com/anti-virus_linux_proxy_server.  Using this
on a Linux box and pointing the ISA server at it as an upstream proxy would
appear to get around my concerns about reliability and performance.

Having such a server might also allow me to install MailScanner -
www.mailscanner.info - with SpamAssassin and a couple of anti-virus products
and use it as a replacement for the MessageLabs mail scanning service.
Voila, 2 invoices killed with one server!

I have several questions:

1. Can the Squid server handle being a mail server too?  I'd invest in
something like an HP DL360 rackmount server with, say, a 3.x GHz processor, 1
GB RAM and 2 x 70 GB or 140 GB disks in a RAID 10 configuration.  We're not
heavy mail users.

I'd give your proposed server more RAM, and note that RAID 10 needs at least four disks (with only two you're limited to RAID 1). Still, for the given load (100 users, likely less than 20 requests/second at peak) I would be surprised if the server you describe had any trouble.
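
For what it's worth, once you've added RAM, a fairly modest cache configuration is all Squid needs at that load. Something along these lines in squid.conf would do (the sizes are only a starting point I'm guessing at for your box, not tuned figures):

  # keep a fixed slice of RAM for hot objects; leave the rest to
  # MailScanner/SpamAssassin and the OS page cache
  cache_mem 256 MB

  # ~20 GB of on-disk cache is plenty for 100 users
  # (format: cache_dir ufs <dir> <Mbytes> <L1 dirs> <L2 dirs>)
  cache_dir ufs /var/spool/squid 20000 16 256

  # don't waste cache space on huge downloads
  maximum_object_size 50 MB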

2. Having thought about the network topology I am seriously considering
putting two NICs in the Squid server, one on the DMZ of the ISA server and
the other on the Internet using one of our spare public IPs. This would get
around what I see as a potential performance issue of the ISA server passing
requests for web sites to the Squid server over the DMZ connection and the
Squid server then passing the same request to the Internet via the DMZ port
of the ISA server.  Does this make sense, or am I exaggerating the
performance hit on the ISA server and would be better off just putting the
Squid server on the DMZ with a single NIC and using rules on the ISA server
to allow it access to the Internet etc?  Bear in mind the Squid server will
be used for SMTP too so I'd need to permit incoming SMTP via the ISA server,
etc.

DMZ usually indicates a separate firewall is involved. What I get from the above is that your ISA server is directly connected to the Internet, acting both as a gateway for all client non-HTTP requests (which it proxies) and as a mail server. Assuming that's the case, passing the web traffic through that server twice is not going to be much of a performance hit. It's an odd networking choice and will add a little latency, but it's unlikely to be noticeable. If you can swing it, it would be better to put the Squid server in that gateway role (since it's already filtering all incoming HTTP and email traffic) or to put in a dedicated firewall.
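
If you do go with two NICs, the Squid side of it mostly comes down to locking the proxy down so only the ISA box can use it, and originating outbound traffic from the public interface. A rough squid.conf sketch (all addresses here are invented, substitute your own DMZ and public addressing):

  # listen only on the DMZ-facing interface
  http_port 192.168.10.2:3128

  # only the ISA server's DMZ address may use the proxy
  acl isa_server src 192.168.10.1/32
  http_access allow isa_server
  http_access deny all

  # send outbound requests out via the public interface
  tcp_outgoing_address 203.0.113.10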

3. How about if I give the Squid server its own high-speed ADSL connection?
I'd do this to conserve bandwidth on our expensive leased line (bandwidth
needed for incoming requests to our web servers).  In this scenario, which
is a likely change within the next few months, I believe I'd need to put a
2nd NIC in the Squid server and pass all web requests over that 2nd card to
the ADSL connection with web page requests from the ISA server going over
the DMZ.  Does this make sense?  Clearly, the Squid server would need to run
firewall software or use simple port forwarding on the ADSL router.

So you'd have the leased line for incoming requests that pass through the ISA server (are they cached by the ISA server or just passed through?) and an ADSL line for outgoing web requests from the Squid server? That wouldn't be too hard, and you could even set up routing on a Linux or BSD box to prefer sending traffic over the ADSL line while keeping the leased line as a backup.
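
On Linux the routing side is just two default routes with different metrics via iproute2; roughly (the gateways and interface names below are placeholders for your setup):

  # eth1 = ADSL router, eth0 = DMZ / leased-line side
  ip route add default via 192.0.2.1 dev eth1 metric 100      # preferred: ADSL
  ip route add default via 192.168.10.1 dev eth0 metric 200   # fallback: leased line

Bear in mind the kernel only falls back automatically if eth1 itself goes down; if the ADSL drops further upstream you'd need a small monitoring script (or a routing daemon) to pull the first route.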

4. I could simply leave things as they are.  The current system works fine
and the company can afford the GBP5k or so per year that we currently pay.
By taking web page scanning and mail scanning in-house I get administrative
hassle and end up relying on one server rather than utilising the hundreds
of servers and human resources that a company like MessageLabs has to draw
on.

Always a good option. If it isn't broken, don't fix it. How many problems would you have to encounter to burn up GBP5k in a year (don't forget the cost of the hardware, power for the new server, additional cooling requirements, hardware maintenance, etc.)? Would the benefits of flexibility outweigh the hassle of managing this setup? Do you have enough in-house expertise to maintain it?

Thanks for reading this far and I welcome any comments or advice.



Chris
