
Re: Sudden but sustained high bandwidth usage

Hi Amos,

It seems the "quick_abort_min -1 KB" did the trick. But I remember that "range_offset_limit" is supposed to override that, isn't it?
Also, I've seen people using -1 instead of "none" for range_offset_limit. Is it the same? :P

quick_abort_min -1 KB
acl wupdatecachable url_regex -i (microsoft|windowsupdate)\.com.*\.(cab|exe|ms[iuf]|dat|zip|psf|appx|appxbundle|esd)
range_offset_limit none wupdatecachable
refresh_pattern -i (microsoft|windowsupdate)\.com.*\.(cab|exe|ms[iuf]|dat|zip|psf|appx|appxbundle|esd) 483840 80% 483840 override-expire ignore-private ignore-no-store
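
As a side note on the quick_abort piece: the quick_abort_* directives act as a group, and setting the minimum to -1 tells Squid never to abandon a partially fetched object once a client disconnects. A minimal sketch, assuming the other two directives stay at what I believe are the shipped defaults:

# Keep fetching an aborted transfer to completion, so the whole object
# can be stored and served as a HIT to the next requester.
quick_abort_min -1 KB
# With -1 above, these two are effectively moot; shown only for
# completeness (believed to be the stock defaults).
quick_abort_max 16 KB
quick_abort_pct 95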

Best Regards,

-- 
Heiler Bemerguy - (91) 98151-4894
Technical Advisor - CINBESA (91) 3184-1751

On 04/03/2016 01:01, Amos Jeffries wrote:
On 4/03/2016 4:49 a.m., Heiler Bemerguy wrote:
Hi Amos,

Didn't you notice it was always the same client? The same IP address
re-downloading endlessly...

I managed to fix it by not caching stuff with "?" in it:

refresh_pattern -i (/cgi-bin/|\?) 0 0% 0

But I don't know if it's the best approach..
Provided you only added that refresh_pattern and no "cache deny" rules,
yes, it is the best solution.
The refresh_pattern only applies to responses that are missing
cacheability headers, so dynamic content which provides those headers
will still be cached and served nicely.
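
Worth keeping in mind: refresh_pattern rules are checked top to bottom and the first matching regex wins, so that dynamic-content line has to sit above the catch-all "." pattern. A minimal sketch of the usual ordering, based on the patterns shipped in the default squid.conf (the numbers are the stock defaults, not anything tuned for this setup):

refresh_pattern ^ftp:             1440  20%  10080
refresh_pattern -i (/cgi-bin/|\?)    0   0%      0
refresh_pattern .                    0  20%   4320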


The URL was like that:
10.101.1.50 TCP_HIT/206 402 GET http://bg.v4.a.dl.ws.microsoft.com/dl/content/d/updt/2015/07/096c4bbc-4bc2-4ba1-8fd7-2e8cf3fb1937_132a7d6799d3bd625b0e5b375aa13552593bf0ed.appxbundle? - HIER_NONE/- application/octet-stream

(After the "?" there were some variables)

Anyway, this isn't the cause of the ultra-high bandwidth load... (our
DL link is 100% used by Squid right now!) Most of the traffic comes from
Windows Updates...

1457015568.658   9400 10.12.0.197 TCP_SWAPFAIL_MISS/206 1067290 GET http://au.v4.download.windowsupdate.com/c/msdownload/update/software/crup/2015/02/publisher-x-none_08ccd79ac8a6bb475040360b6c9d8c9e1f258c9d.cab - HIER_DIRECT/201.30.251.40 application/octet-stream
1457015624.067  36878 10.12.0.234 TCP_MISS/206 77842 GET http://au.v4.download.windowsupdate.com/d/msdownload/update/software/crup/2014/02/windows8.1-kb2919355-x64_66955196a82751d1c8d9806d321487562b159f41.psf - HIER_DIRECT/201.30.251.40 application/octet-stream
1457015750.556 126469 10.12.0.234 TCP_MISS/206 151183 GET http://au.v4.download.windowsupdate.com/d/msdownload/update/software/crup/2014/02/windows8.1-kb2919355-x64_66955196a82751d1c8d9806d321487562b159f41.psf - HIER_DIRECT/201.30.251.40 application/octet-stream
1457015753.263  11011 10.12.0.197 TCP_MISS/206 1616920 GET http://au.v4.download.windowsupdate.com/c/msdownload/update/software/crup/2015/03/onenote-x-none_dd4f2bc75fc38be514c4009ce4d289e41f6b75d0.cab - HIER_DIRECT/201.30.251.40 application/octet-stream
1457015780.978  13451 10.12.0.197 TCP_SWAPFAIL_MISS/206 2225824 GET http://au.v4.download.windowsupdate.com/c/msdownload/update/software/crup/2015/03/onenote-x-none_dd4f2bc75fc38be514c4009ce4d289e41f6b75d0.cab - HIER_DIRECT/201.30.251.40 application/octet-stream

Do you see anything that could make it re-download over and over again
in this config?
The 206. If that is a 206 from the server, Squid is unable to cache it
for future HITs.

acl windowsupdate dstdomain .ws.microsoft.com .windowsupdate.microsoft.com .update.microsoft.com .windowsupdate.com .armdl.adobe.com
http_access allow windowsupdate
range_offset_limit none windowsupdate

Can you try adding this:
  quick_abort_min -1 KB
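
For the record, my reading of how those two pieces are meant to combine for this traffic (domain list shortened here; treat it as an illustrative sketch of the documented behaviour rather than a confirmed recipe):

# Windows Update clients issue Range requests; "none" (no limit) makes
# Squid fetch the object from the beginning so the reply can be cached whole.
acl windowsupdate dstdomain .windowsupdate.com .update.microsoft.com
range_offset_limit none windowsupdate

# If the requesting client then disconnects mid-transfer, keep downloading
# anyway so the completed object is available as a HIT for the next client.
quick_abort_min -1 KB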


Amos


