----- Original Message -----
From: "Jochem Maas" <jochem@xxxxxxxxxxxxx>
Dan Brown (the nice guy on this list, not the twat who wrote 'The Da Vinci
Code') suggests a *much* better way to go, namely using a CLI script. The fun
part is getting a button push on an admin page to somehow initiate the CLI
script.
One way of doing this could be to have a 'job' table in your database into
which 'jobs' are inserted (e.g. 'do my 30000 record import'), and to have your
[CLI] script check the database to see whether it should start a 'job',
exiting immediately if there is nothing to do. Lastly, in order to have the
[CLI] script regularly check whether it needs to do something, you can use
cron to run it at regular intervals (e.g. every 15 minutes).
There are many ways to skin this cat; my guess is that all the decent ways of
doing it will involve a CLI script, probably in conjunction with a cronjob.
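A minimal sketch of what such a worker might look like; the PDO connection
details and the 'jobs' table layout (id, type, status) are invented here for
illustration, not anything from the original setup:

<?php
// worker.php -- run from cron, e.g.:
//   */15 * * * * php /path/to/worker.php
$db = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

// Is there a pending job? If not, just exit as described above.
$job = $db->query("SELECT id, type FROM jobs WHERE status = 'pending' LIMIT 1")
          ->fetch(PDO::FETCH_ASSOC);
if (!$job) {
    exit;
}

// Claim the job so the next cron run doesn't pick it up as well.
$db->exec("UPDATE jobs SET status = 'running' WHERE id = " . (int)$job['id']);

// ... do the actual work here, e.g. the 30000 record import ...

$db->exec("UPDATE jobs SET status = 'done' WHERE id = " . (int)$job['id']);

Since the script runs outside the web server, it is not bound by the web
request's max_execution_time, which is the whole point of the approach.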
If you aren't familiar with AJAX, no worries; the main concept in my
suggestion is sending a number of requests rather than a single request. That
way you can execute a fraction of the queries on each request, which ensures
that you don't hit the maximum execution time.
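A rough sketch of that request-splitting idea; every file, table, and column
name below is invented for illustration:

<?php
// import_chunk.php -- each request handles one slice of the work, so no
// single request runs long enough to hit max_execution_time.
$offset = isset($_GET['offset']) ? (int)$_GET['offset'] : 0;
$chunk  = 500; // sized to finish comfortably within the time limit

$db   = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$rows = $db->query("SELECT * FROM import_staging LIMIT $chunk OFFSET $offset")
           ->fetchAll(PDO::FETCH_ASSOC);

foreach ($rows as $row) {
    // ... transform/insert one record ...
}

// Tell the caller (the AJAX loop) whether to request the next slice.
echo count($rows) < $chunk ? 'done' : 'next=' . ($offset + $chunk);

The client-side loop simply keeps requesting the URL with the returned offset
until it sees 'done'.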
Just to mention an alternative way of breaking up the job. It is just
something that happened to me once and might be useful.
My particular job could naturally be broken into several stages; actually, it
had to be. Though it could have been solved with one huge, complex SQL query
with several joins and subqueries (which were not available), it could also be
performed with the help of a few intermediate auxiliary tables. So I had a
query (which would otherwise have been the subquery) inserting records into a
flat table with no indexes. In a second step I added the index; then there was
another join between this auxiliary table and another table (this one was
pretty complex and, with no stored procedures available at the time, required
some processing in PHP); and finally the step that produced the result. Back
then, before AJAX was popular, I showed the progress in an iframe whose "src"
attribute I changed for each successive step (poor man's AJAX), but it could
also be done via AJAX, or by reloading the whole page instead of an iframe
within it.
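For illustration, a rough reconstruction of that staged approach could look
like the following; all table and column names are invented, and the last few
lines show the iframe trick:

<?php
// step.php?n=1 -- each request performs one stage of the job.
$n  = isset($_GET['n']) ? (int)$_GET['n'] : 1;
$db = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

if ($n === 1) {
    // Fill a flat, index-less auxiliary table (the would-be subquery).
    $db->exec("INSERT INTO aux_flat (customer_id, total)
               SELECT customer_id, SUM(amount) FROM orders
               GROUP BY customer_id");
} elseif ($n === 2) {
    // Only now add the index, so the bulk insert above stays fast.
    $db->exec("CREATE INDEX idx_aux_customer ON aux_flat (customer_id)");
} elseif ($n === 3) {
    // The join that would otherwise have wrapped the subquery.
    $db->exec("INSERT INTO report (name, total)
               SELECT c.name, a.total
               FROM customers c
               JOIN aux_flat a ON a.customer_id = c.customer_id");
}

echo "Step $n finished.";
if ($n < 3) {
    // Poor man's AJAX: this page lives in an iframe, and each finished
    // step sends the iframe on to the next one.
    echo "<script>location.href = 'step.php?n=" . ($n + 1) . "';</script>";
}

Each stage is a separate short request, so the user sees progress and no
single request risks timing out.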
Some time later I tried to redo another such process as a single query with
subqueries, and I found that using the auxiliary tables was faster. I admit
that my attempt was half-hearted: I wanted either to see a big improvement or
to ignore the whole business. The improvement wasn't that big, so I dropped it
and assumed the other processes would not show any big improvement either.
After all, I knew the data and had optimized for it as much as possible, so I
can't assume the SQL optimizer could do much better than I had.
Satyam