Re: file_get_contents for URLs?

> Hey all,

Hello.

> I'm doing some maintenance work on an existing system and there is a piece
> of code that uses file_get_contents() to read data from a URL, which is fine
> in theory I suppose.
>
> But the problem is sometimes the server where that URL lives is not
> available, and the system hangs indefinitely.
>
> Shouldn't this be done with curl, and if so can it be done so that the call
> will time out and return control back when the server is not available?

Looking at the docs, you can pass a stream context as the third
argument to file_get_contents(). So create a context with
stream_context_create(), set a timeout on it (via the http wrapper's
'timeout' option, or stream_context_set_option()), and then pass it to
file_get_contents(). That way the call returns control after the
timeout instead of hanging indefinitely.
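A minimal sketch of that approach (the URL is a placeholder, and the
5-second timeout is an arbitrary choice for illustration):

```php
<?php
// Build a stream context with a timeout for the HTTP wrapper.
// 'timeout' is in seconds and applies to the read, so the call
// fails instead of hanging when the remote server is unreachable.
$context = stream_context_create([
    'http' => [
        'timeout' => 5, // give up after 5 seconds
    ],
]);

// Second argument is $use_include_path; the context comes third.
$data = @file_get_contents('http://example.com/data', false, $context);

if ($data === false) {
    // Timed out, DNS failure, or server unavailable -- handle it here.
    echo "Request failed or timed out\n";
}
```

Note that file_get_contents() returns false on failure, so check with
=== rather than a truthiness test, since an empty response body would
also be falsy.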

-- 
Richard Heyes

HTML5 Canvas graphing for Firefox, Chrome, Opera and Safari:
http://www.rgraph.net (Updated March 28th)

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php

