Re: [users@httpd] Why does Apache use up all my memory?

Thanks, Joe and Jon, for your helpful thoughts on my Apache memory
problem.  Here's some more information:

Joe> > 1-0    15823    W     0.00    1742573500    GET /out/388.mp3
Joe> > 2-0    15824    W     0.00    1742573499    GET /out/238.mp3
Joe>
Joe> Are these all simple static files, or is /out/ handled by some CGI
Joe> script etc?

Joe, you're right - they do get passed through a Perl script for
processing.  However, I don't THINK the following code would produce
the kind of problem I'm seeing:

my $filesize = (stat($sermonfile))[7];   # size in bytes, for Content-Length

# $sermonfile and $sermonfile_short are set earlier in the script
print "Content-Disposition: inline;filename=$sermonfile_short\n";
print "Content-Length: $filesize\n";
print "Content-Type: application/octet-stream\n\n";

open(SERMONFILE, '<', $sermonfile) or die "can't open $sermonfile: $!";
binmode(SERMONFILE);
binmode(STDOUT);

# stream in 1K chunks so only one small buffer is in memory at a time
my $chunk;
while (read(SERMONFILE, $chunk, 1024)) {
   print $chunk;
}
close SERMONFILE;

But even in the worst case, where a bad script read the entire 18MB
MP3 file into memory, that STILL wouldn't seem to account for my 2GB
memory loss: with MaxClients at 20, twenty children each holding a
full 18MB file would only come to about 360MB.
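
To be concrete, here's the sort of worst-case slurp I mean - a sketch
for comparison only, since my actual script reads in 1K chunks:

# Worst-case variant, for comparison only: slurps the whole file at
# once, so each Apache child serving a download holds the full ~18MB.
open(SERMONFILE, '<', $sermonfile) or die "can't open $sermonfile: $!";
binmode(SERMONFILE);
binmode(STDOUT);
my $whole;
{
   local $/;              # undef the input record separator: read it all
   $whole = <SERMONFILE>;
}
close SERMONFILE;
print $whole;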

------------------------------------

Jon> If your clients are downloading 18Mb files over slow links, they
Jon> may keep retrying the connection, breaking the original and
Jon> therefore leaving you with multiple connections to the same file
Jon> from the same client.

Jon, in this particular case I don't think that's happening.  I
generally don't have very long to test things, because once the memory
is exhausted by Apache processes, my SSH connections to the server slow
to a crawl and even logging in at the console becomes nearly
impossible.

I was fortunate enough to catch this occurrence before the memory was
completely used up.  As the RAM got dangerously low, I shut down all my
Apache processes and waited about 2 minutes just to see what the memory
looked like.  Without Apache running, I was back to about 1.6GB free.
Then, while keeping an eye on "top" and the /server-status page, I
started Apache again.

Immediately the visitor's download accelerator began hitting my site
again.  Within seconds all 20 Apache processes were in use (by that one
client), and before 2 minutes had elapsed I was forced to shut down all
the Apache processes as my total free memory fell below 50MB.  I was
able to quickly grab those "top" and "/server-status" snapshots just
before I killed all the Apache children.
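
For the next time this happens, here's a quick script to total Apache's
resident memory at a glance - a rough sketch, assuming Linux /proc and
children named "httpd", and bearing in mind that RSS counts pages
shared between children more than once, so the total overstates true
usage:

#!/usr/bin/perl
# Sketch: total the resident memory (VmRSS) of all "httpd" processes
# by walking Linux /proc.
use strict;
use warnings;

my $total_kb = 0;
opendir(my $proc, '/proc') or die "can't read /proc: $!";
for my $pid (grep { /^\d+$/ } readdir $proc) {
   open(my $st, '<', "/proc/$pid/status") or next;
   my ($name, $rss);
   while (<$st>) {
      $name = $1 if /^Name:\s+(\S+)/;
      $rss  = $1 if /^VmRSS:\s+(\d+)\s+kB/;
   }
   close $st;
   $total_kb += $rss if defined($name) && $name eq 'httpd' && defined($rss);
}
closedir $proc;
printf "httpd total RSS: %.1f MB\n", $total_kb / 1024;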

So in my case, I don't think the problem is half-closed connections or
timeouts.  Under the right circumstances of heavy downloading, a
freshly started Apache server can exhaust my 2GB of memory in less than
2 minutes.

Jon> Your 20 concurrent connections are limited by MaxClients.  I
Jon> assume you are keeping this small because of the size they are
Jon> growing to, as you should be able to get to approx. 175-200, off
Jon> the top of my head, using prefork with 1Gb of memory.  I would
Jon> have thought this would max out pretty quickly with many 18Mb
Jon> downloads, as they will take time.

Yes, normally I would have MaxClients set to something larger.  Setting
it to 20 has been something of a desperation measure, trying to keep
the memory usage under control.  Apparently it's not working.
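
In case it helps anyone else fighting the same thing, these are the
prefork knobs in play - a sketch with illustrative values (only the
MaxClients of 20 reflects my actual config; the rest are assumptions):

<IfModule prefork.c>
   StartServers          5
   MinSpareServers       5
   MaxSpareServers      10
   MaxClients           20    # my current desperation cap
   MaxRequestsPerChild 500    # recycle children so a leak stays bounded
</IfModule>

If a child really is leaking a little on each download,
MaxRequestsPerChild is the usual safety valve, since it limits how much
any one process can accumulate before it's replaced.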



---------------------------------------------------------------------
The official User-To-User support forum of the Apache HTTP Server Project.
See <URL:http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: users-unsubscribe@xxxxxxxxxxxxxxxx
  "   from the digest: users-digest-unsubscribe@xxxxxxxxxxxxxxxx
For additional commands, e-mail: users-help@xxxxxxxxxxxxxxxx


