
Re: Connection breaks abnormally while accessing some sites


On 4/4/06, Bin Liu <binliu.lqbn@xxxxxxxxx> wrote:
> Hi,
>
> I found that when squid fetches some web pages like
> http://en.beijing2008.com/, the connection to the origin server
> always breaks after only one or two packets arrive. A squidclient
> check shows that only the first part of the HTML file arrives; after
> that, squid seems to be waiting for something and then the client
> connection is reset. Here is the access.log:
>
>
>

Hmm, it is correct that the site is using a frontend accelerator;
this can be seen with:

 http://web-sniffer.net/?url=http%3A%2F%2Fen.beijing2008.com%2F&submit=Submit&http=1.0h&gzip=yes&type=GET&ua=Mozilla%2F5.0+%28Windows%3B+U%3B+Windows+NT+5.1%3B+en-US%3B+rv%3A1.8.0.1%29+Gecko%2F20060111+Firefox%2F1.5.0.1+Web-Sniffer%2F1.0.24

And look at the 'Via' header in the 'HTTP Response Header' part of the output.

The thing is that the frontend uses HTTP/1.1 to talk to the inner
'real' webserver. I have seen that when HTTP 1.1 is selected in
'web-sniffer', it returns a response that uses chunked
transfer-encoding.
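
For what it's worth, the same check can be reproduced without
web-sniffer. The sketch below is only an illustration (the host name
comes from Bin's report, the header names are standard HTTP): it sends
the same GET once as HTTP/1.0, as Squid does, and once as HTTP/1.1,
and prints the Via, Transfer-Encoding and Content-Length headers the
frontend sends back.

import socket

def fetch_headers(host, version):
    # Send a minimal GET with the chosen protocol version and return
    # only the response header block.
    req = ("GET / %s\r\n"
           "Host: %s\r\n"
           "Connection: close\r\n"
           "\r\n" % (version, host))
    s = socket.create_connection((host, 80), timeout=10)
    try:
        s.sendall(req.encode("ascii"))
        data = b""
        while b"\r\n\r\n" not in data:
            more = s.recv(4096)
            if not more:
                break
            data += more
    finally:
        s.close()
    return data.split(b"\r\n\r\n", 1)[0].decode("iso-8859-1")

for version in ("HTTP/1.0", "HTTP/1.1"):
    print("=== %s ===" % version)
    for line in fetch_headers("en.beijing2008.com", version).splitlines():
        # Via identifies the frontend; Transfer-Encoding / Content-Length
        # show how the response body is framed.
        if line.lower().startswith(("via:", "transfer-encoding:", "content-length:")):
            print(line)

If the theory holds, the HTTP/1.0 reply should advertise a
Content-Length (or nothing at all), while the HTTP/1.1 reply should
show 'Transfer-Encoding: chunked'.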

Although this header is NOT set when HTTP 1.0 is selected in
web-sniffer (as Squid speaks), I tend to think that perhaps Squid
gets confused because the frontend may start returning data to Squid
in chunks: the frontend speaks HTTP/1.1 with the real webserver, and
chunked transfer-encoding is being used internally between the two
servers (frontend <-> realservers).
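
To make the framing point concrete, here is a minimal sketch of how a
chunked body is laid out on the wire (RFC 2616, section 3.6.1); the
decode_chunked helper and the sample bytes are only illustrative, not
anything Squid actually runs:

def decode_chunked(raw):
    # Walk the chunked body: hex size line, CRLF, that many bytes of
    # data, CRLF, repeated until a zero-size chunk is seen.
    body = b""
    rest = raw
    while True:
        size_line, rest = rest.split(b"\r\n", 1)
        size = int(size_line.split(b";")[0], 16)  # ignore chunk extensions
        if size == 0:
            break                                 # last chunk: body is complete
        body += rest[:size]
        rest = rest[size + 2:]                    # skip data plus trailing CRLF
    return body

# Example of what chunked data looks like on the wire:
wire = b"7\r\nHello, \r\n6\r\nworld!\r\n0\r\n\r\n"
print(decode_chunked(wire))   # -> b'Hello, world!'

A receiver that only understands HTTP/1.0 framing knows nothing about
those hex size lines or the terminating zero-size chunk, so if chunked
data were passed through to it unannounced it would just keep waiting
for more data or for the connection to close, which would match the
stalled transfer described above.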

That's my theory on it.

M.

