
Persistent connections w/ transparent proxy

I'm new to squid but have a problem...

I have toiled with this for quite a while, but I can't figure out why squid won't keep persistent connections to my clients. Squid is currently configured for transparent interception: my clients are not configured to use a proxy, but our Check Point firewall uses an "http_mapped" rule to redirect port 80 traffic to the squid box on port 8080. It works, just not as efficiently as it should.

Currently, I'm on squid 3.0, although I've tried 2.7 and 3.1 - all exhibit the same behavior. Here are some relevant sections of my squid.conf:
----------------------------------------
http_port 8080 transparent
client_persistent_connections      on  #Tried toggling this, no change
server_persistent_connections      on  #Tried toggling this, no change
persistent_connection_after_error  on  #Tried toggling this, no change
pipeline_prefetch                  on  #Tried toggling this, no change
----------------------------------------

Basically, the client makes an HTTP/1.1 GET request for a page but doesn't explicitly send Connection: keep-alive. That shouldn't matter, though, since HTTP/1.1 connections are persistent by default (or so I read). Squid then relays the GET to the destination server and specifies Connection: keep-alive. The server responds with 200 OK and also specifies Connection: keep-alive, so at this point everything is going well. But when squid relays the 200 OK back to my client, the headers contain Connection: close, and squid then sends a TCP FIN to tear down the connection.
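For what it's worth, here is my understanding of the persistence defaults, written out as a quick sanity check (a rough sketch of the RFC 2616 section 8.1 rules, not squid's actual logic; the function name is mine):

```python
def is_persistent(http_version, connection_header=None):
    """Decide whether a connection stays open after a message,
    per the HTTP/1.0 and HTTP/1.1 defaults as I understand them."""
    token = (connection_header or "").strip().lower()
    if http_version == "HTTP/1.1":
        # HTTP/1.1 is persistent unless "Connection: close" is sent.
        return token != "close"
    if http_version == "HTTP/1.0":
        # HTTP/1.0 is non-persistent unless "Connection: keep-alive" is sent.
        return token == "keep-alive"
    return False

# My client's request: HTTP/1.1, no Connection header -> should stay open.
print(is_persistent("HTTP/1.1"))            # True
# What squid sends back to the client: HTTP/1.0 + Connection: close.
print(is_persistent("HTTP/1.0", "close"))   # False
```

So by these rules the client side of the exchange should be persistent, yet squid closes it anyway.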

I've captured numerous samples with Wireshark. Below are the headers for an example conversation. This isn't a one-off caused by a server redirect or anything; the same pattern applies to any kind of GET: images, text, CSS, whatever. I've shortened some of the lines for readability.
----------------------------------------------------------
GET / HTTP/1.1
Host: espn.go.com
Accept-Encoding: gzip
Accept: application/xml..
User-Agent: Mozilla..
Accept-Language: en-US..
Accept-Charset: utf-8..


      GET / HTTP/1.0
      Host: espn.go.com
      Accept-Encoding: gzip
      Accept: application/xml..
      User-Agent: Mozilla..
      Accept-Language: en-US
      Accept-Charset: utf-8..
      Via: 1.1 my-squid (squid/3.0.STABLE21)
      X-Forwarded-For: 10.1.1.1
      Cache-Control: max-age=259200
      Connection: keep-alive


      HTTP/1.1 200 OK
      Cache-Control: no-cache
      Connection: Keep-Alive
      Date: Wed, 30 Dec 2009 21:32:22 GMT
      Pragma: no-cache
      Content-Type: text/html; charset=iso-8859-1
      Last-Modified: Wed, 30 Dec 2009 21:32:14 GMT
      Accept-Ranges: bytes
      Server: Microsoft-IIS/6.0
      Cache-Expires: Wed, 30 Dec 2009 21:32:24 GMT
      Content-Length: 31365
      Vary: Accept-Encoding
      Content-Encoding: gzip


HTTP/1.0 200 OK
Cache-Control: no-cache
Date: Wed, 30 Dec 2009 21:32:22 GMT
Pragma: no-cache
Content-Type: text/html; charset=iso-8859-1
Last-Modified: Wed, 30 Dec 2009 21:32:14 GMT
Accept-Ranges: bytes
Server: Microsoft-IIS/6.0
Cache-Expires: Wed, 30 Dec 2009 21:32:24 GMT
Content-Length: 31365
Vary: Accept-Encoding
Content-Encoding: gzip
X-Cache: MISS from my-squid
Via: 1.0 my-squid (squid/3.0.STABLE21)
Connection: close
----------------------------------------------------------

The clients I'm proxying are coming from a network with very high latency. It's *extremely* costly for the clients to set up and tear down a TCP connection for every page element. Can someone explain why squid isn't keeping the connections to my clients alive?
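In case it helps anyone reproduce this, here's the kind of harness I could use to check connection reuse without Wireshark (a sketch using Python's stdlib; it spins up a local keep-alive server, but you could point the HTTPConnection at the squid box instead to exercise the proxy; all names here are mine):

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"   # keep-alive by default
    def do_GET(self):
        body = b"ok"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, *args):   # silence per-request logging
        pass

def connection_reused(host, port):
    """Issue two GETs on one http.client connection; if the peer honors
    keep-alive, the underlying socket object survives both requests."""
    conn = http.client.HTTPConnection(host, port)
    conn.request("GET", "/")
    conn.getresponse().read()
    first_sock = conn.sock          # socket after the first request
    conn.request("GET", "/")
    conn.getresponse().read()
    reused = first_sock is not None and conn.sock is first_sock
    conn.close()
    return reused

if __name__ == "__main__":
    server = HTTPServer(("127.0.0.1", 0), Handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    print(connection_reused("127.0.0.1", server.server_address[1]))
    server.shutdown()
```

Against the local keep-alive server this prints True; run through the transparent proxy, the second request has to open a fresh socket.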

Thanks for any help...

