How to resolve: "Excess data from ..." error

Dear All,

I am seeing a lot of the following messages logged in my cache.log file:

2010/05/10 19:23:59| httpReadReply: Excess data from "GET http://ff.search.yahoo.com/gossip?command=..."
2010/05/10 19:24:00| httpReadReply: Excess data from "GET http://ff.search.yahoo.com/gossip?command=..."

(Sorry, I snipped the search string and replaced it with three dots, for confidentiality.)

Is there a way to resolve that?

Or, if not, can Squid be made to operate in a pass-through mode for certain domains, such as search.yahoo.com? In other words, can we configure an ACL for certain domains so that Squid simply forwards a client's request straight to the origin server, without parsing the exchange and reporting "Excess data..."?
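Something along these lines is what I have in mind (just a sketch on my part, and I am not sure these directives would actually suppress the parsing; the domain here is only an example):

    # Hypothetical: exempt certain domains from caching and send them direct
    acl passthrough_domains dstdomain .search.yahoo.com
    cache deny passthrough_domains
    always_direct allow passthrough_domains

I realize Squid may still need to parse the reply even with something like this, which is partly why I am asking.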

I also observe these:

2010/05/10 19:34:36| WARNING: unparseable HTTP header field {GET / HTTP/1.1}
2010/05/10 19:24:00| clientTryParseRequest: FD 5013 (snipped client_ip:port) Invalid Request

Are there fixes for these as well?

I am running Squid-2.7.STABLE9 on Fedora 12 (kernel-2.6.32.11-99.fc12.x86_64).

I'd appreciate any help.

Thanks & regards,
Khem
