Hello Amos, and thank you for the very interesting reply!

Even though that website is not very standards-compliant, is there a way or a patch to let Squid browse it normally? The problem with these kinds of sites is that users complain that, from home, the same websites work fine!

Thank you again,
Francesco

> On Mon, 7 Mar 2011 13:43:52 +0100 (CET), Francesco wrote:
>> Hello,
>>
>> I am experiencing some "zero sized reply" problems even though I have
>> upgraded to version 3.1.8.
>>
>> For example, this is an example site:
>> http://itinerari.mondodelgusto.it/eventi
>> Trying this site without the proxy, it works.
>>
>> I have tried this workaround I found on the list:
>> acl broken dstdomain .mondodelgusto.it
>> request_header_access Accept-Encoding deny broken
>>
>> but it does not work...
>>
>> Any ideas?
>>
>> Thank you!
>>
>> Francesco
>
> Confirmed. The website is attempting to do browser and client IP
> sniffing, but its scripts seem to crash when processing the client IP
> passed on by a proxy.
>
> This will happen with any proxy using the X-Forwarded-For header. The site
> at least produces a page when there is no such header, or when the header
> contains the common "unknown". But as soon as anything other than
> "unknown" is present, it aborts the transaction.
>
> Since it was browser sniffing, I tried a few UA strings too. It seems
> not to like anything strange in there either, dying with a long hang and
> then "Your browser sent a request that this server could not
> understand.".
>
> The "Vary: User-Agent" statement, claiming that each UA type gets a unique
> reply, is bogus. The only change between page loads is an inlined advert,
> which changes even if the same UA loads a page twice.
>
> The "Vary: Host" statement, claiming that pages differ by domain name, is
> worse than useless. That is a basic assumption of HTTP being re-stated in
> a way that merely slows down middleware processing the site.
>
> Amos
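
P.S. Based on your observation that the site only breaks when a real client IP
appears in X-Forwarded-For, would something like this in squid.conf be worth
testing? It is only a rough sketch I have not tried yet, and I am assuming the
forwarded_for options available in 3.1 (as far as I know the directive is
global, so it would affect all sites, not just the broken one):

# Send "X-Forwarded-For: unknown" instead of the real client IP
forwarded_for off

# Or, if this 3.1 build supports it, do not send the header at all
#forwarded_for delete

Of course, whether this is acceptable depends on whether anything upstream
relies on seeing the real client IP in that header.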