
Re: How to resolve: "Excess data from ..." error

Khemara Lyn wrote:
Dear All,

I got a lot of the following logged messages in my cache.log file:

2010/05/10 19:23:59| httpReadReply: Excess data from "GET http://ff.search.yahoo.com/gossip?command=..."
2010/05/10 19:24:00| httpReadReply: Excess data from "GET http://ff.search.yahoo.com/gossip?command=..."

(Sorry, I snipped off the search string and replaced it with three dots "..." for confidentiality.)

Is there a way to resolve that?

Or, if not, I'd like to know if we can make Squid behave in pass-through mode for certain domains, such as search.yahoo.com. In other words, can we configure an ACL for certain domains so that Squid just forwards a request from the client straight to the origin server, without parsing the reply and reporting "Excess data..." or not?

No. "Excess data" is a poisoning attack. Letting it past results in serious security breaches affecting all users of the proxy.

Whether intentional or not, it is a corruption of reply data in the HTTP connection and needs to be fixed at the source of the problem.
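To illustrate why the excess data is dangerous (this is a minimal sketch, not Squid's actual parser): when a server sends more bytes than its Content-Length header declares, the leftover bytes remain queued on the persistent connection, where a naive reader can mistake them for the start of the *next* response — the basis of response-splitting / cache-poisoning attacks.

```python
# Sketch: a naive HTTP reader that trusts Content-Length.
# Any bytes past the declared length look like a second response.

raw = (
    b"HTTP/1.1 200 OK\r\n"
    b"Content-Length: 5\r\n"
    b"\r\n"
    b"hello"
    # Excess data: an attacker-crafted "response" smuggled after the body.
    b"HTTP/1.1 200 OK\r\nContent-Length: 4\r\n\r\nevil"
)

header, _, rest = raw.partition(b"\r\n\r\n")
headers = dict(line.split(b": ", 1) for line in header.split(b"\r\n")[1:])
length = int(headers[b"Content-Length"])

body, excess = rest[:length], rest[length:]
print(body)    # the legitimate body: b'hello'
print(excess)  # leftover bytes a naive client could treat as the reply
               # to its next request on this connection
```

A cache that stored `excess` as the answer to a subsequent request would then serve the attacker's content to every client — which is why Squid refuses to let it past.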


I also observe these:

2010/05/10 19:34:36| WARNING: unparseable HTTP header field {GET / HTTP/1.1}
2010/05/10 19:24:00| clientTryParseRequest: FD 5013 (snipped client_ip:port) Invalid Request

Any fixes too?

Now that error, if you look at the trace, should be fixable. Simply find out which header is broken and figure out whether the fix needs to be done in the client software or in Squid (usually it's a broken Java applet or dynamic web page).
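One way to get such a trace (a suggested setup, not from the original thread) is to raise Squid's debug level for the HTTP section in squid.conf, then look in cache.log for the raw headers around each warning:

```
# squid.conf fragment (sketch): log HTTP traffic in detail.
# Section 11 covers HTTP; ALL,1 keeps the rest of the log quiet.
debug_options ALL,1 11,5
```

After a `squid -k reconfigure`, cache.log should show the offending request headers near each "unparseable HTTP header field" warning, which identifies the client software sending them.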


I am running Squid-2.7.STABLE9 on Fedora 12 (kernel-2.6.32.11-99.fc12.x86_64).

I'd appreciate any help.

Thanks & regards,
Khem


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE9 or 3.1.3
