
Re: downloads fail after a while in 3.1.0.14

On Wed, 2 Dec 2009 15:10:44 -0800 (PST), Landy Landy
<landysaccount@xxxxxxxxx> wrote:
> Right now I'm trying to download a file from Intel that's 432MB and it
> has failed 5 times already. It started and failed at 110MB, later started
> it again and failed at 19MB, and so on. The best it has done is 110MB.
> 
> I don't know what's going on. This time squid won't cache this one since
> I have maximum_object_size 256 MB

I expect this explains the second failure at 19MB: because the object is too
large to cache, each retry starts over from zero. If Squid were able to cache
the object, I think it would get past 110MB before failing again.

I suspect one of the timeouts (request or read) is set too low for your
bandwidth.

That, and/or the quick_abort settings, may be combining to cause this.

It could also be the old enemies of broken ECN and TCP window sizing.
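For reference, these are the squid.conf directives I mean. The values below
are just the shipped defaults as illustration, not recommendations; check
what your own squid.conf sets before changing anything:

 # How long Squid waits for more data on an open server connection.
 # Too short for a slow link and large downloads get cut off mid-transfer.
 read_timeout 15 minutes

 # How long Squid waits for a complete request from the client.
 request_timeout 5 minutes

 # Whether Squid keeps fetching an object after the client disconnects.
 # With min and max both small, a client abort kills the server fetch.
 quick_abort_min 16 KB
 quick_abort_max 16 KB
 quick_abort_pct 95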

> 
> I would increase this just for the moment to 512MB to test if it
> downloads it.
> 

Are you able to capture the traffic between the client browser and
Squid?

Running this on the squid box should log it to a file "$CLIENTIP.log" for
detailed analysis:
 tcpdump -i $ETH -s 0 -w $CLIENTIP.log host $CLIENTIP

NP: substitute the client's IP address for $CLIENTIP, and for $ETH the
name of the interface Squid talks to the client through (e.g. eth0, eth1).

Amos
