> If you need more control then I suggest you switch from
> file_get_contents() to using curl to retrieve the page. You can then set
> your timeout criteria.

Bingo! That appears to have done it. In fact, not only did it solve the
timeout problem raised in
http://marc.info/?l=php-general&m=125622597723923&w=2 but it also seems to
have accidentally solved the separate problem of mysterious access
violation errors raised in
http://marc.info/?l=php-general&m=125601395731668&w=2

Apparently curl not only gives me control over the execution time of the
call, it also seems to handle its file access more cleanly somehow. I say
this not only because the access violation errors have gone away, but also
because failed access attempts are now much less frequent than they were
with file_get_contents(). Thank you big time, Robert.

If anyone would like to see it, the function I wrote to replace both
file_get_contents() and simplexml_load_file() is below; a short usage
sketch follows at the end of this message. In the file_get_contents()
role, it can optionally also do the job of a subsequent
file_put_contents(). In the simplexml_load_file() role, it works by
passing the downloaded file as a string to simplexml_load_string(). If
anyone sees any problems or weaknesses with how this is written, please do
let me know. (Please be sure to include me in the recipients, because I am
going to unsubscribe from the list now.)

Thanks all for your help.

====================================================================
// Fetch a file from a URL. If $sFileOut is empty, the contents are
// returned as a string; otherwise they are written to $sFileOut and
// the downloaded size in bytes is returned.
function GetFile($sURLin, $sFileOut)
{
    $oChandle = curl_init($sURLin);
    if ($sFileOut == '')
    {
        // Return the download as a string.
        $bWriteFile = false;
        curl_setopt($oChandle, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($oChandle, CURLOPT_BINARYTRANSFER, true);
    }
    else
    {
        // Write the download directly to the output file.
        $bWriteFile = true;
        $oFilePtr = fopen($sFileOut, 'w');
        curl_setopt($oChandle, CURLOPT_FILE, $oFilePtr);
    }
    curl_setopt($oChandle, CURLOPT_HEADER, false);
    curl_setopt($oChandle, CURLOPT_CONNECTTIMEOUT, 5); // give up connecting after 5 seconds

    $bGetIt = true;
    while ($bGetIt)
    {
        $sRetVal = curl_exec($oChandle);
        if ($sRetVal === false)
        {
            sWarning('DL error on ' . $sURLin);
            set_time_limit(30); // reset the script timeout clock
            sleep(1);           // pause briefly before retrying
            if ($bWriteFile)
            {
                // Discard any partial data written before the failure,
                // so the retry does not append to a corrupt file.
                ftruncate($oFilePtr, 0);
                rewind($oFilePtr);
            }
        }
        else
        {
            if ($bWriteFile)
                $sRetVal = curl_getinfo($oChandle, CURLINFO_SIZE_DOWNLOAD);
            $bGetIt = false;
        }
    }
    curl_close($oChandle);
    if ($bWriteFile)
        fclose($oFilePtr);
    return $sRetVal;
}
====================================================================

-----Original Message-----
From: Robert Cummings [mailto:robert@xxxxxxxxxxxxx]
Sent: Thursday, October 22, 2009 11:21
To: Marshall Burns
Cc: php-general@xxxxxxxxxxxxx
Subject: Re: Trapping failure of file_get_contents()

Marshall Burns wrote:
> Robert and others,
>
> I made that change in the code. It still does not trap the failure. I
> believe the reason is that the script is timing out while
> file_get_contents() is sitting there waiting for input. The problem is
> that the function never returns, so there is no return value to check.
> What I need is to get file_get_contents() to quit trying before the
> script timeout limit is reached, so I can catch the failure and try
> again. If that cannot be done, then I need to get the shutdown function
> to work.
>
> For reference, my original question is at
> http://marc.info/?l=php-general&m=125622597723923&w=2 .

If you need more control then I suggest you switch from
file_get_contents() to using curl to retrieve the page. You can then set
your timeout criteria.

Cheers,
Rob.
--
http://www.interjinn.com
Application and Templating Framework for PHP
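
P.S. Here is the promised usage sketch showing how GetFile() stands in for
the two original calls. The URL and the output path are made-up
placeholders, not values from my real script:

====================================================================
// Replacing simplexml_load_file(): an empty second argument makes
// GetFile() return the download as a string, which then goes to
// simplexml_load_string(). The feed URL is a hypothetical placeholder.
$sXML  = GetFile('http://example.com/feed.xml', '');
$oFeed = simplexml_load_string($sXML);

// Replacing file_get_contents() plus file_put_contents(): a non-empty
// second argument writes the download straight to that file, and the
// return value is the number of bytes downloaded.
$nBytes = GetFile('http://example.com/data.csv', '/tmp/data.csv');
====================================================================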