Re: Unbelievable: one single Apache process uses more than the whole server's memory (5 gigabytes)!

On 18.08.2012 05:55, Brett Maxfield wrote:

On 18/08/2012, at 6:46 AM, Denis BUCHER <dbucherml@xxxxxxxxxxxxx> wrote:

Dear all,

This is an unbelievable issue, but we have a single Apache process that takes 5 GB of memory! And it doesn't always happen with the same URLs; it's unpredictable, and we don't understand why it is happening at all!

Any help would be greatly appreciated (by us, as well as by the website's users and owners)!

We even spent days developing software to analyze what is happening!

We run pmap PID every second when the problem comes up, and the offending line is:

00002aad145c2000 2929376K rw---    [ anon ]

So what can we do with that? Is there a way to know what it is?

We also analyze open files with lsof -p.

I am now convinced that it only happens when the connection comes from Googlebot (????)

Do you have suggestions on how to analyze this?

Notes :

  • The config is Linux + Apache + PHP + PostgreSQL
  • Of course, PHP's memory_limit was the first thing we checked, but it is not the problem: grep memory_limit /etc/php.ini => memory_limit = 32M
  • We also tried RLimitMEM, but either it doesn't work or we didn't use it correctly; in any case it had no effect: the process still uses more than 5 gigabytes
  • This unanswered thread looks similar to our problem: http://serverfault.com/questions/161478/apache-process-consuming-all-memory-on-the-server
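One note on the RLimitMEM bullet above: according to the Apache httpd documentation, RLimitMEM limits the memory of processes *launched by* httpd children (CGI scripts, piped loggers), not the httpd children themselves or code running inside mod_php — which would explain why it had no visible effect here. A sketch of the directive's syntax (the 512 MB figures are arbitrary example values):

```apache
# RLimitMEM constrains processes forked by Apache children (e.g. CGI),
# not httpd itself or mod_php.
# Syntax: soft limit, then optional hard limit, in bytes.
RLimitMEM 536870912 536870912
```

To bound the httpd children themselves, one alternative is a ulimit -v set in the shell that starts httpd, at the cost of aborting whatever request exceeds it.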

Thanks a lot for any help and/or suggestion !

Denis

Do you have a robots.txt? Maybe you have some dynamic page that, when spidered, returns a very deep structure, or some script that returns a temporary failure, causing a loop?

Look at the HTTP log for Googlebot and see if there is any pattern to it. If you have the access logs, they will tell you the script name at least.
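That log check can be sketched like this; the log path and the combined log format (request path in the 7th field) are assumptions, so adjust LOG to whatever the CustomLog directive actually points at:

```shell
#!/bin/sh
# Summarise which URLs Googlebot requests most often.
# Assumes a combined-format access log; adjust the default path below.
LOG=${1:-/var/log/httpd/access_log}
# In the combined format the request path is the 7th whitespace field.
grep -i 'Googlebot' "$LOG" | awk '{print $7}' | sort | uniq -c | sort -rn | head -20
```

If one script dominates the counts, that is the first candidate to fetch by hand and to exclude via robots.txt while investigating.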
If you find anything suspicious, exclude it from spidering with robots.txt.
If you don't, maybe it is some sort of non-Google bot attack, and you can block it with Apache (a hacker or snoop bot won't likely honor robots.txt).
In any case, if you identify the path, maybe you can help identify / fix the bug..
Cheers
Brett

Dear Brett,

Thanks for your reply, but there is no pattern at all; Googlebot just seems to fetch "normal" pages on our webserver!

We have spent at least 150-200 hours analyzing logs, memory usage, and file usage, and the only pattern we found is that the client is Googlebot!

And whatever page it is, is there some explanation for why Apache can take all the server's memory? PHP is limited, so how is it possible for Apache to do that?

Under which circumstances is Apache capable of using so much memory? I mean: why, where, how?

Thanks a lot for any further help !


Denis

P.S. Our robots.txt file contains only two lines:

User-agent: *
Disallow: /newsite_temp2011_old/
