Is the entire record set data, or does it consist of JavaScript fluff and other furniture that might be better handled by client-side DHTML processing?
Also, with 2,500 records I would use a more efficient search, or page your output into, say, 25 pages of 100 records each. If your query often returns the same data, cache the output as a file on the server and regenerate that file only when you apply updates to the database.
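A minimal sketch of the file-cache idea, assuming a hypothetical `build_options_from_db()` callback that runs the query and returns the finished `<option>` markup (the function name, cache path, and max-age value are illustrative, not from the original post):

```php
<?php
// Serve the <option> list from a cached file, regenerating it only
// when the cache file is missing or older than $maxAgeSeconds.
// $builder is any callable that rebuilds the HTML from the database.
function get_options_html($cacheFile, $maxAgeSeconds, $builder)
{
    if (file_exists($cacheFile)
        && (time() - filemtime($cacheFile)) < $maxAgeSeconds) {
        return file_get_contents($cacheFile);   // cache hit: no DB work
    }
    $html = $builder();                         // cache miss: rebuild
    file_put_contents($cacheFile, $html);       // refresh the cache
    return $html;
}
```

An alternative, as suggested above, is to invalidate explicitly: delete the cache file whenever an update is applied to the table, and drop the age check entirely.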
Cheers - Neil.
At 22:11 21/12/2003 +0000, you wrote:
From: "Robin Kopetzky" <sparkyk@xxxxxxxxxxxxxxxxx>
To: "PHP DB Group" <php-db@xxxxxxxxxxxxx>
Date: Sun, 21 Dec 2003 15:15:35 -0700
Message-ID: <HCEBLNDFHDPOOELMIHOMMEIICOAA.sparkyk@xxxxxxxxxxxxxxxxx>
MIME-Version: 1.0
Content-Type: text/plain; charset="iso-8859-1"
Content-Transfer-Encoding: 7bit
Subject: PHP/DB speed
Good afternoon!
I am writing a project and have a speed concern...
The code I am using is thus and is retrieving around 2,500 records:
$result = mysql_query($sql);
while ($row = mysql_fetch_array($result)) {
    // build <OPTION> stmt
}
Is there a faster method? Timed this with microtime, and .9 seconds to retrieve the data and output the web page seems REAL slow. Now this is on a 100Base-T network, but I imagine it would be like watching paint dry over a 56K modem. Any thoughts or ideas on accelerating this? I did try ob_start() and ob_end_flush(), and it didn't help...
Thanks for any help in advance.
Robin 'Sparky' Kopetzky
Black Mesa Computers/Internet Service
Grants, NM 87020
--
PHP Database Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php