Re: How would you do this ?

On Mon, September 25, 2006 8:26 am, Jad madi wrote:
> I'm building an RSS aggregator, so I'm trying to find the best way to
> parse each user's feeds equally. Let's say we have 20,000 users with
> an average of 10 feeds per account, so about 200,000 feeds.
>
> How would you schedule the parsing process to keep all accounts
> updated at all times without killing the server? NOTE: some of the
> 200,000 feeds might be shared between more than one user.

Before I spent another microsecond worrying about splitting up my
processor time, I'd write a test application with the plainest, most
straightforward architecture possible, to see whether I need to worry
about splitting up my processor time at all. :-)

Just rip through all the feeds and do the job, and stop whining about
it :-) :-) :-)

More seriously, until you *know* you can't do it, or have metrics
showing what you can do, attempting to optimize is just plain silly.

If you *do* need to optimize, K.I.S.S.

Instead of ranking users and all that, just run through X% of each
user's feeds, where X is a number you can control through an admin
page.

And round "up", so that even if X% of a user's feed list works out to
0.0001, you still parse at least one feed for each user.
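The round-up is just a `ceil()` call. A minimal sketch (the `feedsToParse` name, the `$feeds` array of feed URLs, and the admin-configured percentage `$x` are all illustrative, not from the original post):

```php
<?php
// Take the first X% of a user's feeds for this pass, rounding up so
// that even a tiny percentage still yields at least one feed per user.
function feedsToParse(array $feeds, float $x): array
{
    if (empty($feeds)) {
        return [];
    }
    // ceil() rounds up: 0.0001% of 10 feeds is still 1 feed.
    $count = (int) ceil(count($feeds) * $x / 100);
    return array_slice($feeds, 0, $count);
}
```

With X set to 100 on the admin page this degenerates to "just rip through everything", which is the right starting point anyway.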

Definitely run this as a daemon or cron job or something similar, that
just keeps ripping through the feeds and starting over.
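One pass of such a cron job can also handle the "shared feeds" note for free: collect every user's feed list, dedupe the URLs, and fetch each unique feed once. A sketch, assuming a hypothetical `$userFeeds` map (user => list of feed URLs) and a `$parseFeed` callback standing in for your real fetch/parse code:

```php
<?php
// One full pass: dedupe the URLs across all users so a feed shared by
// several users is fetched only once, then parse each unique feed.
// Returns the number of feeds actually parsed.
function runPass(array $userFeeds, callable $parseFeed): int
{
    $unique = array_unique(array_merge(...array_values($userFeeds)));
    foreach ($unique as $url) {
        $parseFeed($url);
    }
    return count($unique);
}
```

The daemon or cron job just calls `runPass()` in a loop (with a short `sleep()` between passes so a fast run doesn't hammer the feed sources) and starts over.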

The overhead of the book-keeping you are contemplating will dwarf the
original problem you are trying to solve, which may not even be a
problem in the first place.

-- 
Like Music?
http://l-i-e.com/artists.htm

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php

