Re: How would you do this?

Jad madi wrote:
I'm building an RSS aggregator, so I'm trying to find the best way to
parse users' account feeds evenly. Let's say we have 20,000 users with
an average of 10 feeds per account, so we have about 200,000 feeds.

How would you schedule the parsing process to keep all accounts up to
date without killing the server? NOTE: some of the 200,000 feeds might
be shared between more than one user.

Cache the feeds for a period of time.

So a (very) basic process would look like:

- Does a cached copy exist? How long has it been around?

- Shorter than "X" minutes?
-- Serve up the cached version; the request dies off quickly.

- Longer than "X" minutes?
-- See if there is a new version. If there is, update the cache; if there
isn't, update the cache timestamp (so it won't be checked again for
another "X" minutes).

Of course you'd have to build in handling for things like the feed not
being reachable (site down, DNS problems, whatever)..
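
A minimal sketch of that check in PHP, just to show the shape of it. The
cache directory, the 30-minute value for "X", the md5-of-URL file naming,
and the plain re-download (instead of a conditional GET) are all
assumptions for the example, not something from the original post:

<?php
// Illustrative cache-or-refresh check. Paths, TTL and naming are
// assumptions for this sketch, not part of the original suggestion.

define('CACHE_DIR', '/tmp/feed-cache');
define('CACHE_TTL', 30 * 60);            // "X" minutes, in seconds

function get_feed($url)
{
    if (!is_dir(CACHE_DIR)) {
        mkdir(CACHE_DIR, 0777, true);
    }

    $cacheFile = CACHE_DIR . '/' . md5($url) . '.xml';

    // Cache exists and is shorter than "X" minutes old: serve it as-is.
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < CACHE_TTL) {
        return file_get_contents($cacheFile);
    }

    // Cache is missing or older than "X" minutes: try to fetch a new copy.
    $fresh = @file_get_contents($url);

    if ($fresh !== false) {
        // Got a new version: update the cache.
        file_put_contents($cacheFile, $fresh);
        return $fresh;
    }

    // Feed unreachable (site down, DNS problems, ...): bump the cache
    // timestamp so the dead feed isn't retried for another "X" minutes,
    // and serve the stale copy if there is one.
    if (is_file($cacheFile)) {
        touch($cacheFile);
        return file_get_contents($cacheFile);
    }

    return false;   // nothing cached and the fetch failed
}
?>

Since feeds shared between users map to the same cache file, overlapping
subscriptions across the 20,000 accounts only cost one fetch per feed per
"X"-minute window.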

--
Postgresql & php tutorials
http://www.designmagick.com/


