Re: Squid stops handling requests after 30-35 requests

Hi,

I upgraded Squid to 3.1.20-2.2 from debian.org. The issue still persists.
Note: I have not disabled the stats collection that was mentioned in earlier mails.

Please suggest how to resolve this.

Thanks,
Bhagwat

On Fri, Nov 22, 2013 at 4:47 PM, Eliezer Croitoru <eliezer@xxxxxxxxxxxx> wrote:
> And what is the content of this 503 page?
> I do not know what the issue at hand is, but there are a couple of things to
> test first before running a full debug or trying to fix issues that might
> not exist.
>
> The version upgrade is there for a reason.
> I know an upgrade might not solve the issue, but still, if you have a
> testing environment, try to verify the results with the latest 3.1.X
> release, which should be 3.1.21 if I am not wrong.
>
> It is very critical for you to test it.
> Since Squid can run on many OSes and many hardware specs, your logs are
> nice but do not help in understanding the whole issue.
>
> There are many bugs that were fixed over the life of the 3.1 branch, and I
> have used it for a very long time.
>
> If you need help installing 3.1.21 or any newer version, I can try to
> assist you.
> It can also be installed alongside another version.
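>
> A rough sketch of such a side-by-side install from source (assuming a
> build toolchain is present; the install prefix below is only an example):
>
>   tar xzf squid-3.1.21.tar.gz
>   cd squid-3.1.21
>   ./configure --prefix=/opt/squid-3.1.21
>   make && make install
>
> Using a dedicated prefix keeps the new build completely separate from the
> packaged version, so you can test it without touching the running proxy.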
>
> Best Regards,
> Eliezer
>
>
> On 21/11/13 09:38, Bhagwat Yadav wrote:
>>
>> Hi Eliezer/All,
>>
>> Thanks for your help.
>>
>> Please find attached the log snippets.
>> Log1.txt contains sample 1 of cache.log, in which you can see the time
>> gap.
>> Log2.txt contains sample 2 of the client output and cache.log, also
>> showing the time gap.
>>
>> It seems that some in-memory operation, "StatHistCopy", is causing this
>> issue, though I am not sure.
>>
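>> One way to check whether the hang lines up with the statistics code (a
>> sketch; it assumes squidclient is installed and Squid listens on the
>> default port 3128) is to poll the cache manager while the test runs:
>>
>>   squidclient -p 3128 mgr:info
>>   squidclient -p 3128 mgr:5min
>>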
>> Squid version is: Squid Cache: Version 3.1.6.
>>
>> Please let me know if these logs are helpful.
>>
>>
>> Thanks & Regards,
>>
>> On Wed, Nov 20, 2013 at 6:11 PM, Eliezer Croitoru <eliezer@xxxxxxxxxxxx>
>> wrote:
>>>
>>> Hey,
>>>
>>> Can you try another test?
>>> wget is very nice to use, but there are a couple of options that need to
>>> be considered.
>>> Just to help you, if it is not there already, add --delete-after
>>> to the wget command line.
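>>>
>>> For example (a sketch, reusing the URL from your test):
>>>
>>>   wget --delete-after http://www.naukri.com/
>>>
>>> This removes the local copy after each download, so a separate rm step is
>>> no longer needed.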
>>>
>>> It's not related to Squid, but it helps a lot.
>>> Now, if you are up to it, I would be happy to see the machine specs and
>>> OS. Also, what is the output of "squid -v"?
>>>
>>> Can you ping the machine at the time it gets stuck? What about a TCP
>>> ping, or "nc -v squid_ip port"?
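>>>
>>> For instance (a sketch; 3128 is only Squid's default port, substitute
>>> your real IP and port):
>>>
>>>   ping -c 3 squid_ip
>>>   nc -v squid_ip 3128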
>>>
>>> We also need to verify in the access logs that it is not naukri.com
>>> thinking your client is trying to turn it into a DDoS target.
>>> What about trying to access other resources?
>>> What is written in this 503 response page?
>>>
>>> Eliezer
>>>
>>>
>>> On 20/11/13 12:35, Bhagwat Yadav wrote:
>>>>
>>>>
>>>> Hi,
>>>>
>>>> I enabled logging but didn't find any conclusive or decisive log
>>>> entries to forward to you.
>>>>
>>>> In my testing, I am accessing the same URL 500 times in a loop from the
>>>> client using wget.
>>>> Squid hung sometimes after 120 requests, sometimes after 150 requests,
>>>> as shown here:
>>>>
>>>> rm: cannot remove `index.html': No such file or directory
>>>> --2013-11-20 03:52:37--  http://www.naukri.com/
>>>> Resolving www.naukri.com... 23.72.136.235, 23.72.136.216
>>>> Connecting to www.naukri.com|23.72.136.235|:80... connected.
>>>>
>>>> HTTP request sent, awaiting response... 503 Service Unavailable
>>>> 2013-11-20 03:53:39 ERROR 503: Service Unavailable.
>>>>
>>>>
>>>> Whenever it hangs, it resumes after about 1 minute; e.g., in the
>>>> example above, the request sent at 03:52:37 only got its response at
>>>> 03:53:39.
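>>>>
>>>> The test loop looks roughly like this (a sketch reconstructed from the
>>>> output above; the rm explains the "cannot remove" message):
>>>>
>>>>   for i in $(seq 1 500); do
>>>>       rm index.html
>>>>       wget http://www.naukri.com/
>>>>   done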
>>>>
>>>> Please provide more help.
>>>>
>>>> Many Thanks,
>>>> Bhagwat
>>>
>>>
>>>
>