quoth the Robert Cummings:

> Why do you do this on every request? Why not have a cron job retrieve an
> update every 20 minutes or whatnot and stuff it into a database table
> for your page to access? Then if the cron fails to retrieve the feed it
> can just leave the table as is, and your visitors can happily view
> slightly outdated feeds? Additionally this will be so much faster that
> your users might even hang around on your site :)

This is a very interesting idea, but I am not sure it is suitable for me at
this point. First of all, one feed in particular can change in a matter of
seconds, and I want it to be as up to date as possible. Secondly, this is
just for my personal site, which is very low traffic, so it is not
inconceivable that fetching the feed every 20 minutes by cron would be
_more_ taxing on the network than simply grabbing it per request...

To be fair, when everything is working as it should, the feeds are
retrieved in a matter of seconds, and I don't think that annoys my users at
all. It is the 0.5% of requests when the remote site is overloaded (or just
plain down) that I want to provision for here.

I do like the idea of caching the feed, though. In my situation, rather
than prefetching the feed at regular intervals, it may be better to cache
the most recent request and check the age of the cache when the next
request comes in (see the sketch below). That way I would not be needlessly
updating the cache during the stretches when the page with my feeds goes a
few hours without a request. Of course, this still wouldn't solve my
original problem.

> Cheers,
> Rob.

Thanks for your insight,
-d
--
darren kirby :: Part of the problem since 1976 :: http://badcomputer.org
"...the number of UNIX installations has grown to 10, with more expected..."
- Dennis Ritchie and Ken Thompson, June 1972
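
For what it's worth, here is a minimal sketch of the cache-and-check-age
approach described above. It is illustrative only: the thread contains no
code, so the language (Python), the feed URL, cache path, and timeout
values are all assumptions rather than anything from the original message.

```python
import os
import time
import urllib.request

# Placeholder values -- not from the original message.
FEED_URL = "http://example.com/feed.rss"
CACHE_FILE = "/tmp/feed-cache.rss"
MAX_AGE = 20 * 60        # serve the cached copy if it is younger than this
FETCH_TIMEOUT = 5        # give up on a slow remote site after a few seconds


def get_feed():
    """Return the raw feed, preferring a sufficiently fresh cached copy."""
    try:
        age = time.time() - os.path.getmtime(CACHE_FILE)
    except OSError:
        age = None  # no cache yet

    # Fresh enough: skip the network entirely.
    if age is not None and age < MAX_AGE:
        with open(CACHE_FILE, "rb") as f:
            return f.read()

    # Cache stale or missing: try the remote site, but don't hang the page.
    try:
        with urllib.request.urlopen(FEED_URL, timeout=FETCH_TIMEOUT) as resp:
            data = resp.read()
        with open(CACHE_FILE, "wb") as f:
            f.write(data)
        return data
    except OSError:
        # Remote site overloaded or down: fall back to the stale copy, if any.
        if age is not None:
            with open(CACHE_FILE, "rb") as f:
                return f.read()
        raise
```

Falling back to the stale copy when the fetch fails is what covers the 0.5%
case: the page serves a slightly outdated feed instead of hanging on an
overloaded or unreachable remote site.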