Re: Long Execution Time - Safe Mode

I don't know if it will work, but it's definitely worth a shot: try making a
recursive call from your PHP page to the same PHP page, and in each call do,
say, 10 edits. Using this method you can probably get around the server's
execution-time limit, but I've never tested it myself, so I can't guarantee
anything ;)
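
Something along these lines, maybe - a rough, untested sketch where the chunk
size, the table/column names, and the connection details are just placeholders
for whatever your script actually does:

<?php
// Untested sketch: process a small chunk per request, then redirect to
// the same script for the next chunk, so no single request runs long
// enough to hit max_execution_time. Everything below is a placeholder.
$chunk  = 10;
$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

$db = new mysqli('localhost', 'user', 'pass', 'mydb');

$result = $db->query("SELECT id FROM items ORDER BY id LIMIT $offset, $chunk");
while ($row = $result->fetch_assoc()) {
    // ... do one "edit" here ...
}

if ($result->num_rows === $chunk) {
    // There may be more rows left - hand off to a fresh request,
    // carrying the new offset along.
    header('Location: ' . $_SERVER['PHP_SELF'] . '?offset=' . ($offset + $chunk));
    exit;
}
?>

Whatever triggers the first request just has to follow the redirects - a
browser will, and curl needs -L (or CURLOPT_FOLLOWLOCATION).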

HTH,
Nitsan

On Fri, Mar 6, 2009 at 12:33 AM, haliphax <haliphax@xxxxxxxxx> wrote:

> On Thu, Mar 5, 2009 at 4:14 PM, Chris <dmagick@xxxxxxxxx> wrote:
> >
> > Firstly, always CC the mailing list so others can add their own
> > suggestions.
> >
> > Also, please don't put your reply at the top; it makes it very hard to
> > follow what's going on. Put it underneath or inline (put comments after
> > mine and put more later on).
> >
> > ????? ???? wrote:
> >>
> >> Several problems....
> >>
> >> First, I don't have cron jobs either (I'm using OnlineCronJobs.com, which
> >> limits how many cron jobs I can have). As I said, I am running the script
> >> every 8 hours.
> >
> > Does your host not support cron jobs? Find another host - or find another
> > cron provider that lets you run more frequently. There are others around.
> >
> >> If I "delete" the row from the db after *each* execution, then it's 100
> >> queries per page, excluding the queries that already exist - a lot of
> >> resources.
> >
> > No, it's not. A table with 100 rows is nothing; it's tiny. It takes longer
> > for you to read this than it does for a db to process 100 rows.
>
> I wrote a scraping program that ran from a shared server at one point.
> To get around the execution time limit (since I was at the mercy of
> connection speeds to the page being scraped), I had the script process
> X records, then forward the user's browser to the same script with
> parameters to instruct it to process the next X records.
>
> This was done in PHP browser mode, of course, and not CLI. I called
> the script using a scheduled task I had set up on my desktop PC that
> used cURL to kick the whole thing off.
>
> This is far and away one of the more ridiculous loops I've had to make
> in order to get around server limitations... but it worked. Anyway,
> Chris is right--100 rows of non-derived data is child's play for an
> RDBMS to churn out.
>
> If your server supports shell scripting (although I doubt it, if
> they're not letting you do cron jobs and they have safe_mode on), you
> could probably accomplish all of this with the mysql command-line
> tool.
>
> Just rambling at this point. Sorry. :D
>
>
> --
> // Todd
>
> --
> PHP General Mailing List (http://www.php.net/)
> To unsubscribe, visit: http://www.php.net/unsub.php
>
>
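
For what it's worth, the external kick-off Todd describes doesn't have to be
anything fancy. Just as a guess (the URL, the offset parameter, and the limits
below are made up), a one-shot trigger using PHP's curl extension could look
like:

<?php
// Hypothetical trigger: request the first page of the chain and keep
// following its Location: redirects until the work is done.
$ch = curl_init('http://example.com/process.php?offset=0');
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow each redirect hop
curl_setopt($ch, CURLOPT_MAXREDIRS, 50);        // safety cap on the hops
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't echo the pages
curl_exec($ch);
curl_close($ch);
?>

This would have to run from a machine without safe_mode (your desktop, or the
external cron service), since PHP disables CURLOPT_FOLLOWLOCATION when
safe_mode is on.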
