Dieter Bloms wrote:
Hi Amos,
On Sun, Mar 15, Amos Jeffries wrote:
I use a url_rewrite_program, which seems to die after about 400000
requests.
Squid starts 15 processes, which are enough, but after some time one
process after another dies, and in the end all processes were gone.
Is it possible to have Squid restart a url_rewrite_program when it dies?
What version of Squid are you using that does not do this restart
automatically?
Squid only dies when ALL helpers for a needed service are dying too fast to
recover quickly.
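(For reference, the helper program and the number of child processes are set in squid.conf. A minimal fragment matching the setup described in this thread might look like the following; the helper path is the poster's own, and the directive values are illustrative, not a recommendation.)

```
# squid.conf fragment (Squid 2.7) - illustrative values
url_rewrite_program /usr/local/bin/webcatredir
url_rewrite_children 15
```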
I use squid 2.7.STABLE6.
I have 15 processes running; when I kill 2 of them, I see only 13 of 15
processes running in the cache manager menu,
like:
--snip from cache manager menu --
Redirector Statistics:
program: /usr/local/bin/webcatredir
number running: 13 of 15
requests sent: 2482
replies received: 2481
queue length: 0
avg service time: 3.33 msec
--snip--
To me it looks like the 2 killed processes will not be restarted, or does
it take some time?
It may take some time.
They are restarted when they are needed, up to the configured maximum
number of children, or at some point when the dead helpers are noticed by
other means.
IIRC the too-fast-to-recover threshold is 1 death per request handled or
fewer. (A helper dying after its first request is fatal; dying before
responding to its first request is even more fatal.)
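(As a side note for anyone debugging a crashing helper: a Squid 2.x redirector is just a program that reads one request per line on stdin and writes one reply line per request, either a rewritten URL or an empty line to leave the URL unchanged. Below is a minimal sketch of such a helper, hypothetical and not the poster's webcatredir; the host names and the rewrite policy are made up for illustration. The key point is that the helper catches its own exceptions, so a bad input line cannot kill the process and trip Squid's too-fast-to-recover check.)

```python
#!/usr/bin/env python3
"""Minimal sketch of a Squid 2.x redirector helper (illustrative only).

Squid writes one request per line on stdin, roughly:
    URL client_ip/fqdn ident method
and expects one reply line per request: a rewritten URL, or an
empty line meaning "leave the URL unchanged".
"""
import sys


def rewrite(line: str) -> str:
    """Return the reply for one request line ("" = no change)."""
    fields = line.split()
    if not fields:
        return ""
    url = fields[0]
    # Hypothetical policy: send one banned host to a block page.
    if "banned.example.com" in url:
        return "http://blockpage.example.com/denied.html"
    return ""


def main() -> None:
    """Helper main loop: call this when run as the redirector."""
    for line in sys.stdin:
        # Never let an exception kill the helper: Squid counts helper
        # deaths, and helpers that die too fast are not restarted.
        try:
            reply = rewrite(line)
        except Exception:
            reply = ""
        sys.stdout.write(reply + "\n")
        sys.stdout.flush()  # reply immediately, one line per request
```

A helper like this is started by the url_rewrite_program directive; it should loop forever on stdin and exit only when Squid closes the pipe.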
Amos
--
Please be using
Current Stable Squid 2.7.STABLE6 or 3.0.STABLE13
Current Beta Squid 3.1.0.6