On 19-Nov-08, at 12:52 PM, Nathan Rixham wrote:
Rene Fournier wrote:
Hi,
I have four identical command-line PHP scripts running, and each
frequently fetches some data from another server via
file_get_contents(). By frequently, I mean on average once every second.
Periodically, one of the processes (command-line PHP scripts) will
fail on file_get_contents() with the error message:
The first thing that springs to mind is some form of hardware
limitation; I'm quite sure it's not PHP. It could be a firewall with
flood protection (or even your own ISP's anti-malware set-up).
To combat it, try binding the outgoing request to a random IP each
time (if you have multiple IPs on the box) [context: socket ->
bindto].
That could explain it, except that all the traffic is on the same LAN.
There's no firewall between Server A and Servers B and C.
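For reference, the bindto idea looks roughly like this with a stream
context. This is an untested sketch; the IP addresses below are
placeholders, not from the thread:

<?php
// Untested sketch: bind the outgoing request to one of the box's own
// IPs, picked at random. The addresses here are placeholders.
$localIps = array('192.168.0.100', '192.168.0.101');

$context = stream_context_create(array(
    'socket' => array(
        // 'local-ip:port' -- port 0 lets the OS pick a free local port
        'bindto' => $localIps[array_rand($localIps)] . ':0',
    ),
));

$data = file_get_contents('http://192.168.0.5/data', false, $context);
if ($data === false) {
    // the fetch failed -- log and retry rather than dying
}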
Next up (very unlikely), but possibly an outgoing port conflict, where
the previous local port is still closing (TIME_WAIT) while the new
connection tries to re-open it.
That's interesting. I will look into that.
To get an ideal fix, though, you'll want to move away from
file_get_contents(), as you're not doing things
Yes, I've also read that cURL is preferred to file_get_contents() for
reasons of performance and security. I'm going to try that too.
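A rough sketch of the cURL route, reusing a single handle so cURL can
keep the TCP connection alive between requests. Untested; the URL and
loop count are placeholders standing in for the once-a-second fetch
described above:

<?php
// Untested sketch: one cURL handle reused across requests, so cURL can
// keep the underlying connection open. URL and count are placeholders.
$ch = curl_init('http://192.168.0.5/data');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return body as a string
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);    // seconds to connect
curl_setopt($ch, CURLOPT_TIMEOUT, 5);           // seconds for the whole request

for ($i = 0; $i < 10; $i++) {
    $body = curl_exec($ch);
    if ($body === false) {
        error_log('fetch failed: ' . curl_error($ch));
    } else {
        // process $body ...
    }
    sleep(1); // roughly one request per second, as in the original post
}
curl_close($ch);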
the most efficient way; HTTP/1.1 allows you to keep a port open and
make multiple requests through the same socket/connection. Simply
keep the socket open and don't send a "Connection: close" header after
the request. (I say simply, but you'll need to make your own, or find
a good, HTTP handler that lets you write raw requests and decode the
raw HTTP responses that come back.)
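A bare-bones sketch of that raw-socket approach, assuming the server
replies with a Content-Length header (chunked transfer encoding isn't
handled, to keep it short). Untested; host and path are placeholders:

<?php
// Untested sketch: one persistent HTTP/1.1 connection, several requests.
// HTTP/1.1 connections are keep-alive by default, so we simply never
// send "Connection: close".
$fp = fsockopen('192.168.0.5', 80, $errno, $errstr, 5);
if (!$fp) {
    die("connect failed: $errstr ($errno)\n");
}

function fetch($fp, $host, $path) {
    fwrite($fp, "GET $path HTTP/1.1\r\nHost: $host\r\n\r\n");

    // read headers line by line until the blank line, noting Content-Length
    $length = 0;
    while (($line = fgets($fp)) !== false && rtrim($line) !== '') {
        if (stripos($line, 'Content-Length:') === 0) {
            $length = (int) trim(substr($line, 15));
        }
    }

    // read exactly Content-Length bytes of body, leaving the socket open
    $body = '';
    while ($length > 0 && !feof($fp)) {
        $chunk = fread($fp, $length);
        if ($chunk === false) {
            break;
        }
        $body .= $chunk;
        $length -= strlen($chunk);
    }
    return $body;
}

for ($i = 0; $i < 10; $i++) {
    $data = fetch($fp, '192.168.0.5', '/data');
    // process $data ...
    sleep(1);
}
fclose($fp);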
Best of luck; feel free to post your code in case anything jumps out
as obvious.
I will let you know how it goes. Thanks for the advice!
...Rene