
Re: Squid stops handling requests after 30-35 requests


 



Hey,

Can you try another test?
wget is fine for this, but there are a couple of options that need to be considered.
To start with, if it is not there already, add --delete-after
to the wget command line.
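If it helps, the test loop might look something like this; the proxy address, port, and repeat count are placeholders, not values from this thread:

```shell
#!/bin/sh
# Sketch of the wget test loop with --delete-after added.
# PROXY and URL are assumed placeholders; adjust for the real setup.
PROXY="http://squid_ip:3128"
URL="http://www.naukri.com/"

# --delete-after discards each downloaded file, so the loop needs no
# separate rm of index.html (which otherwise fails once a fetch 503s).
CMD="wget -q --delete-after -e use_proxy=yes -e http_proxy=$PROXY $URL"
echo "Would run 500 times: $CMD"

# Uncomment to actually run the loop:
# for i in $(seq 1 500); do $CMD; done
```

The -e options set wgetrc commands on the command line, which is one way to point wget at a proxy without editing ~/.wgetrc.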

It's not related to Squid, but it helps a lot.
Now, if you are up for it, I would be happy to see the machine specs and OS.
Also, what is the output of "squid -v"?

Can you ping the machine at the time it gets stuck? What about a TCP ping, or "nc -v squid_ip port"? We also need to verify in the access logs that it's not naukri.com thinking your client is trying to turn it into a DDoS target.
What about trying to access other resources?
What is written in this 503 response page?
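For reference, those checks could be run along these lines; the host name, port 3128, and the log path are assumptions, so substitute your own:

```shell
#!/bin/sh
# Reachability checks to run while Squid appears stuck.
# Host, port, and log path below are placeholders.
SQUID_HOST="squid_ip"
SQUID_PORT=3128
LOG="/var/log/squid/access.log"

# Printed rather than executed so the sketch stands alone;
# run them by hand against the real host while it is hung.
echo "ping -c 3 $SQUID_HOST"                 # is the box reachable at all?
echo "nc -v -w 5 $SQUID_HOST $SQUID_PORT"    # does the proxy port still accept TCP?
echo "grep ' 503 ' $LOG | tail"              # did Squid itself log the 503s?
```

If ping works but the nc connection is refused or times out, the box is up while Squid is not accepting connections, which narrows the problem down considerably.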

Eliezer

On 20/11/13 12:35, Bhagwat Yadav wrote:
Hi,

I enabled logging but didn't find any conclusive or decisive logs
that I could forward to you.

In my testing, I am accessing the same URL 500 times in a loop from the
client using wget.
Squid hung sometimes after 120 requests, sometimes after 150, as shown:

rm: cannot remove `index.html': No such file or directory
--2013-11-20 03:52:37--  http://www.naukri.com/
Resolving www.naukri.com... 23.72.136.235, 23.72.136.216
Connecting to www.naukri.com|23.72.136.235|:80... connected.
HTTP request sent, awaiting response... 503 Service Unavailable
2013-11-20 03:53:39 ERROR 503: Service Unavailable.


Whenever it hangs, it resumes after about 1 minute; e.g. in the example
above the request was sent at 03:52:37 and the response came at 03:53:39.
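That gap works out to just over a minute; a quick check (assuming GNU date's -d option):

```shell
#!/bin/sh
# Compute the stall length from the two timestamps in the wget output.
START=$(date -u -d "2013-11-20 03:52:37" +%s)
END=$(date -u -d "2013-11-20 03:53:39" +%s)
echo "stall: $((END - START)) seconds"   # prints: stall: 62 seconds
```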

Please provide more help.

Many Thanks,
Bhagwat




