
Re: Squid downloading same files many times?


 



Elvar wrote:


Elvar wrote:

Hello,

Recently I discovered that one of my clients' bonded T1 links to the internet was getting maxed out for up to an hour at a time. After doing some research and using Sawmill to generate usage statistics on user surfing activity, I found that some users had downloaded 1.5 gigs' worth of Adobe Reader and various other files. Adobe Reader is obviously not that large, and I know the user wouldn't just keep downloading it over and over on purpose. I'm wondering if somehow Squid is repeatedly downloading these files and, if so, why? I've had Squid running at this place for a long time and I've never seen anything like this. It's a real problem though, because when it happens the link is completely saturated.


The exact URL that was being accessed most recently is http://ardownload.adobe.com/pub/adobe/reader/win/8.x/8.1.3/enu/AdbeRdr813_en_US.msi .

Is anyone else experiencing something like this lately?


Thanks,
Elvar



Sorry about that, I forgot to send in plain text. The URL again is

http://ardownload.adobe.com/pub/adobe/reader/win/8.x/8.1.3/enu/AdbeRdr813_en_US.msi


Could it be that the update is failing and continuously retrying?

Have the following been adjusted from their stock settings?

quick_abort_min
quick_abort_max
range_offset_limit
maximum_object_size

It sounds like the client is requesting the file in sections, but Squid is grabbing the whole thing. Since it is a 34 MB download, it might be larger than your maximum_object_size (which defaults to 4 MB), and as such is not being cached.
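
If the goal is to let Squid cache the installer once and serve the later range requests from the cached copy, a squid.conf excerpt along these lines is one way to approach it. This is only a sketch of the directives listed above; the specific values (the 100 MB cap in particular) are illustrative assumptions, not settings taken from this thread, so check them against the documentation for the Squid version in use:

    # Raise the cap from the 4 MB default so the ~34 MB MSI fits in the cache.
    maximum_object_size 100 MB

    # A negative value tells Squid to fetch the whole object when a client
    # sends a range request, so later ranges can be answered from cache.
    range_offset_limit -1

    # -1 KB means "never abort": finish the background fetch even if the
    # client disconnects, so the completed object actually gets cached.
    quick_abort_min -1 KB

The opposite approach is to leave maximum_object_size alone and keep range_offset_limit at its default of 0, so Squid simply forwards each range request upstream instead of pulling down the entire file every time.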

Elvar

Chris

