Re: Retrieving large results from a database ends in memory error.

Use mysql_unbuffered_query()
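Something along these lines should keep memory use flat. This is a rough
sketch (untested); process_user() is just a placeholder for whatever
display logic you have:

$q = "SELECT * FROM `users` ORDER BY UserID";
// mysql_unbuffered_query() streams rows from the server as you fetch
// them, instead of buffering the entire result set in PHP's memory.
$r = mysql_unbuffered_query($q) or die("Query error: " . mysql_error());

$count = 0;
while ($row = mysql_fetch_array($r, MYSQL_ASSOC)) {
    // Handle each row as it arrives. Collecting them all into an
    // array would defeat the point of the unbuffered query.
    process_user($row); // placeholder for your display code
    $count++;
}
echo 'Done, retrieved ' . $count . ' records.';

Two caveats with unbuffered results: mysql_num_rows() doesn't work until
all rows have been fetched, and you must read the whole result set before
sending another query on the same connection.

If you really do need everything in memory at once, the other option is
the chunked loop you suggested: page through the table with LIMIT, along
the lines of

$chunk = 1000; // rows per pass; pick whatever fits your memory limit
for ($offset = 0; ; $offset += $chunk) {
    $r = mysql_query("SELECT * FROM `users` ORDER BY UserID LIMIT $offset, $chunk")
        or die("Query error: " . mysql_error());
    if (mysql_num_rows($r) == 0) {
        break; // no rows left
    }
    while ($row = mysql_fetch_array($r, MYSQL_ASSOC)) {
        process_user($row);
    }
    mysql_free_result($r); // release each chunk before fetching the next
}

but note that large offsets get progressively slower, so the single
unbuffered query is usually the cheaper choice.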

RaTT wrote:
Hi Guys

I am trying to retrieve over 5000 rows from a MySQL database. I have to
use a "SELECT *" query, as I am required to use all the fields for
display.

Every time I try to run the code below I get an "Allowed memory size of
10485760 bytes exhausted (tried to allocate 40 bytes)" error. Now, if I
change PHP's memory limit in the php.ini file to something like 80MB then
it's fine, but I can't guarantee that the person this code is for will be
able or willing to access their php.ini file.

Basically, what I am asking is: is there a better way to write this query
so I can still retrieve all the results from the database without running
into memory issues? Should I break it up into chunks and put it through a
loop?

Any assistance on retrieving large amounts of info from a db would be
most appreciated.

This is the basic code I am using:

$q = "SELECT * FROM `users` ORDER BY UserID";
$r = mysql_query($q) or die("There has been a query error: ".mysql_error());
if($getR = mysql_fetch_array($r,MYSQL_ASSOC)){
       do {
               $userarray[] = $getR;
       }
       while($getR = mysql_fetch_array($r));
       echo 'Done retrieved '.count($getR).' records.';
}
else {
       echo 'there has been an error retrieving the results.';
}

Thanks, Jarratt

