Martin Alterisio wrote:
Maybe you can read the contents of the feeds using fsockopen() and
stream_set_timeout() to adjust the timeout, or stream_set_blocking()
to read it asynchronously, and then load the xml with
simplexml_load_string().
PS: I forgot to reply to all and mention you'll have to send the GET
HTTP request line and headers yourself; check the PHP manual for examples.
You can just use fopen() to avoid all that.
eg.
$fd = fopen($url, 'r');
stream_set_blocking($fd, true);
stream_set_timeout($fd, 5); // 5-second timeout
$data = stream_get_contents($fd);
$status = stream_get_meta_data($fd);
fclose($fd);
if ($status['timed_out']) {
    echo "Timed out";
} else {
    $xml = simplexml_load_string($data);
}
As for your caching, make sure you create the cache file atomically. So
how about this:
function request_cache($url, $dest_file, $ctimeout = 60, $rtimeout = 5) {
    if (!file_exists($dest_file) || filemtime($dest_file) < (time() - $ctimeout)) {
        $stream = fopen($url, 'r');
        stream_set_blocking($stream, true);
        stream_set_timeout($stream, $rtimeout);
        // Put the temp file in the destination directory so the
        // rename() stays on one filesystem and is actually atomic.
        $tmpf = tempnam(dirname($dest_file), 'YWS');
        file_put_contents($tmpf, $stream);
        fclose($stream);
        rename($tmpf, $dest_file);
    }
}
That takes the URL of your feed, a destination file to cache to, a cache
timeout (as in, only fetch from the feed if the cache is older than 60
seconds) and finally the request timeout.
Note the file_put_contents of the stream straight to disk, so you don't
ever suck the file into memory. You can then use a SAX parser like
xmlreader on it and your memory usage will be minimal. You will need
PHP 5.1.x for this to work.
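To sketch the XMLReader side: something like the following walks the cached
file one node at a time, so memory use stays flat no matter how big the feed
is. The inline sample feed and the 'title' element name are assumptions for
illustration; with a real cache file you'd call $reader->open($dest_file)
instead of $reader->XML().

```php
<?php
// Incremental parse with XMLReader (PHP 5.1+). The sample string below
// stands in for the cached feed file.
$sample = '<rss><channel><item><title>First</title></item>'
        . '<item><title>Second</title></item></channel></rss>';

$reader = new XMLReader();
$reader->XML($sample); // for a cache file: $reader->open($dest_file);

$titles = array();
while ($reader->read()) {
    if ($reader->nodeType == XMLReader::ELEMENT && $reader->name == 'title') {
        $reader->read();           // advance to the text node
        $titles[] = $reader->value;
    }
}
$reader->close();

print implode(',', $titles);
```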
You could also use apc_store/fetch and skip the disk copy altogether.
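The APC variant could look roughly like this (untested sketch; assumes the
apc extension is loaded, and the function name is mine, not a built-in).
Note that unlike the disk version it does hold the whole response in memory,
so you trade the streaming benefit for skipping the filesystem.

```php
<?php
// Same caching idea, but keyed in APC's shared memory instead of a file.
// apc_fetch() returns false on a cache miss or expired entry.
function request_cache_apc($url, $ctimeout = 60, $rtimeout = 5) {
    $data = apc_fetch($url);
    if ($data === false) {
        $stream = fopen($url, 'r');
        stream_set_blocking($stream, true);
        stream_set_timeout($stream, $rtimeout);
        $data = stream_get_contents($stream);
        fclose($stream);
        apc_store($url, $data, $ctimeout); // the TTL replaces the mtime check
    }
    return $data;
}
```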
(untested and typed up during a long boring meeting, so play with it a bit)
-Rasmus
--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php