Hi,
I have four identical command-line PHP scripts running, and each will
frequently fetch some data from another server via
file_get_contents(). By frequently, I mean on average, every second.
Periodically, one of the processes (command-line PHP scripts), will
fail on file_get_contents(), with the error message:
PHP Warning: file_get_contents(http://.../): failed to open stream:
HTTP request failed!
Sometimes it's a single failure; other times it fails repeatedly for
30-60 seconds, then starts working again. Strange, no?
At first, I thought maybe I'd maxed out the server in question, but I
haven't. This problem happens on both servers that the scripts fetch
data from. And more significantly, while one process may fail at
file_get_contents(), the other processes (running identical code) on
the same box continue to execute the function (against the same
servers) without incident.
My question is: is there some resource limit in Mac OS X Server 10.4
(or PHP 5.2.4) that would prevent a continuously running PHP script
from executing file_get_contents()? And to be clear, the failure
doesn't kill the script, and after the failure it starts working again.
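In case it helps anyone reproduce or work around this, here is a minimal sketch of the kind of wrapper I mean. It assumes the default HTTP stream wrapper; the function name and the retry/timeout values are just illustrative, not anything built into PHP:

```php
<?php
// Hypothetical helper: fetch a URL with an explicit timeout and a few
// retries, so a transient "HTTP request failed!" doesn't stall the loop.
function fetch_with_retry($url, $retries = 3, $timeoutSecs = 5)
{
    // Without a context, file_get_contents() waits on the INI setting
    // default_socket_timeout, which defaults to 60 seconds -- roughly
    // the length of the stalls described above.
    $ctx = stream_context_create(array(
        'http' => array('timeout' => $timeoutSecs),
    ));

    for ($i = 0; $i < $retries; $i++) {
        // @ suppresses the warning; a === false check detects failure.
        $data = @file_get_contents($url, false, $ctx);
        if ($data !== false) {
            return $data;
        }
        sleep(1); // brief pause before retrying
    }
    return false; // caller decides what to do on persistent failure
}
```

This doesn't answer why one process fails while its identical siblings succeed, but it at least bounds how long a bad fetch can block.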
...Rene
--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php