Re: PHP cron job optimization

On Sat, Sep 10, 2011 at 1:47 PM, Sean Greenslade <zootboysean@xxxxxxxxx> wrote:
> On Sat, Sep 10, 2011 at 4:35 AM, muad shibani <muad.shibani@xxxxxxxxx>wrote:
>
>> I want to design an application that reads news from RSS sources.
>> I have about 1,000 RSS feeds to collect from.
>>
>> I will also use cron jobs every 15 minutes to collect the data.
>> The question is: is there a clever way to collect all those feed
>> items without exhausting the server? Any ideas?
>> Thank you in advance.
>>
>
> Do them one at a time. Fetching web pages isn't a particularly taxing job
> for any decent server.
>
> My advice is just to try it. Even if your solution isn't 'elegant' or
> 'clever,' if it works and doesn't bog down the server, that's a win in my
> book.
>
> --
> --Zootboy
>
> Sent from my PC.
>

Traversing the internet is one of the most expensive I/O operations
that can be performed.  The OP should try to split the work into
several processes and use async connections via a mechanism like
curl_multi [1].  Doing 1,000 connections one at a time is abysmal:
at even 3 seconds per connection, a sequential run takes 50 minutes,
which blows right past the 15-minute cron interval.
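Something along these lines, as a rough sketch (the URL list, the
batch size of 50, and the 10-second timeout are placeholders, not
tuned values):

    <?php
    // Fetch feeds in concurrent batches with curl_multi.
    $feedUrls = array(/* ... your 1,000 feed URLs ... */);

    foreach (array_chunk($feedUrls, 50) as $batch) {
        $mh = curl_multi_init();
        $handles = array();

        foreach ($batch as $url) {
            $ch = curl_init($url);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
            curl_setopt($ch, CURLOPT_TIMEOUT, 10); // cap slow feeds
            curl_multi_add_handle($mh, $ch);
            $handles[$url] = $ch;
        }

        // Drive all transfers in this batch concurrently.
        $running = null;
        do {
            curl_multi_exec($mh, $running);
            if (curl_multi_select($mh) === -1) {
                usleep(100000); // select failed; back off briefly
            }
        } while ($running > 0);

        foreach ($handles as $url => $ch) {
            $body = curl_multi_getcontent($ch);
            // ... hand $body off to the parser / store here ...
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
        }

        curl_multi_close($mh);
    }
    ?>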

I'd make it a two-step process: one cron job that constantly
fetches/updates feeds into a store, and a second process that
constantly checks that store for new items to parse.
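The consumer side could look something like this, assuming a
hypothetical feed_cache table (id, body, processed) that the fetcher
writes raw XML into:

    <?php
    // Second process: parse whatever the fetcher has stored.
    // The feed_cache schema here is a placeholder for illustration.
    $pdo = new PDO('mysql:host=localhost;dbname=news', 'user', 'pass');

    $rows = $pdo->query('SELECT id, body FROM feed_cache WHERE processed = 0');
    $mark = $pdo->prepare('UPDATE feed_cache SET processed = 1 WHERE id = ?');

    foreach ($rows as $row) {
        $feed = simplexml_load_string($row['body']);
        if ($feed === false) {
            continue; // malformed XML; leave unprocessed for inspection
        }
        foreach ($feed->channel->item as $item) {
            // ... insert/update news items from $item here ...
        }
        $mark->execute(array($row['id']));
    }
    ?>

Decoupling the two means a slow or dead feed only delays fetching,
never the parsing of everything already in the store.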

[1] http://us3.php.net/manual/en/function.curl-multi-add-handle.php




