
url length limit


 



Hi all - I am using an array of squid servers to accelerate dynamic
content, running 2.6.22 and handling a daily average of about 400
req/sec across the cluster.  We operate diskless and enjoy a great hit
rate (>80%) on very short-lived content.

Fifty or more times per day, entries like the following appear in my cache.log:

squid[735]: urlParse: URL too large (4738 bytes)
squid[735]: urlParse: URL too large (4470 bytes)
squid[735]: urlParse: URL too large (4765 bytes)
...

I understand that Squid rejects URLs longer than 4096 bytes, a limit set
at compile time by MAX_URL in src/defines.h, and that changing this has
not been tested.  Nevertheless, since I am expecting very long URLs (all
requests carry long query strings; responses are SOAP/XML), and the
rejected requests are not far over the limit, I would like to explore
raising it.
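For concreteness, the change I have in mind is just the one define in
src/defines.h (a sketch only; I have not tested it, 8192 is an arbitrary
illustrative value, and other code may assume the 4096-byte limit):

```
/* src/defines.h -- stock value is 4096; 8192 chosen only for illustration */
#define MAX_URL 8192
```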

Has anyone redefined MAX_URL in their squid setups?  Do these 'URL too
large' requests get logged?  If not, is there a way I could get Squid to
tell me what the requests were so that I can verify that we have an
operational need to increase the URL limit?
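Since cache.log only reports the length, the best I have come up with so
far is summarizing how far over 4096 bytes the rejected URLs run.  A
sketch (the sample lines below stand in for my real cache.log; point LOG
at the actual file instead):

```shell
# Summarize the sizes reported in 'urlParse: URL too large (N bytes)' lines.
LOG=$(mktemp)
cat > "$LOG" <<'EOF'
squid[735]: urlParse: URL too large (4738 bytes)
squid[735]: urlParse: URL too large (4470 bytes)
squid[735]: urlParse: URL too large (4765 bytes)
EOF
grep -o 'URL too large ([0-9]* bytes)' "$LOG" |   # isolate the size phrase
  grep -o '[0-9]*' |                              # keep just the byte count
  sort -n |
  awk '{n++; sum+=$1} END {printf "count=%d max=%d mean=%.0f\n", n, $1, sum/n}'
rm -f "$LOG"
```

On the three sample lines this prints count=3 max=4765 mean=4658, which
at least tells me whether a modest bump to MAX_URL would cover the
rejects, even if it does not show the URLs themselves.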

Thanks in advance

