Thodoris wrote:
Stewart Duncan wrote:
Hi there,
I'm having some serious problems with the PHP Data Object functions.
I'm trying to loop through a sizeable result set (~60k rows, ~1gig)
using a buffered query to avoid fetching the whole set.
No matter what I do, the script just hangs on the PDO::query() - it
seems the query is running unbuffered (why else would the change in
result set size 'fix' the issue?). Here is my code to reproduce the
problem:
<?php
// Connect with the pdo_mysql driver; note the buffered-query
// attribute being turned on here.
$Database = new PDO(
    'mysql:host=localhost;port=3306;dbname=mydatabase',
    'root',
    '',
    array(
        PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
        PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => true
    )
);
Don't you want it the other way around? You want it unbuffered so
MySQL doesn't have to build the whole result set and ship it to PHP
before you can touch it.
You want to start using rows immediately - so make it unbuffered.
Either way, if the result set is large enough you may exhaust memory
on either the MySQL side or the PHP side. So whichever mode you use,
you may need to fine-tune PHP (raise the per-process memory_limit in
php.ini) or MySQL.
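If you only need the higher limit for a one-off script, a minimal
sketch (the 512M figure is just a placeholder, not a recommendation):

<?php
// Raise this script's ceiling at runtime instead of editing php.ini;
// size the value to your actual data - 512M here is only illustrative.
ini_set('memory_limit', '512M');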
The point of unbuffered queries is that they only take up memory while
transferring a particular row's data. PHP won't pull the whole lot
from MySQL, store it all locally (taking up memory for every row in
the result set), and only then start working on it.
So it won't take up much memory - you'll only ever have one row of
data in memory at a time. Of course, the downside is that you can't
run another query on the same connection until the whole result has
been processed.
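A minimal sketch of that unbuffered loop, assuming pdo_mysql, with
made-up table/column names (big_table, payload) and a hypothetical
process() handler standing in for your per-row work:

<?php
$db = new PDO(
    'mysql:host=localhost;port=3306;dbname=mydatabase',
    'root',
    '',
    array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION)
);
// Switch this connection to unbuffered mode: rows stream from MySQL
// one at a time instead of being copied into PHP up front.
$db->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$result = $db->query('SELECT id, payload FROM big_table');
foreach ($result as $row) {
    // Only the current row is held in PHP memory here.
    process($row); // hypothetical per-row handler
}
// Until the loop drains the result, this connection is busy and
// can't run another query.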
Personally I'd never try to work on 60k items at once; I'd chunk my
results (work on, say, 1000 items at a time) and go from there,
roughly as sketched below - but that of course depends on what needs
to be done.
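A rough sketch of that chunking, assuming an indexed integer primary
key id on the same hypothetical big_table, with process() again
standing in for the per-item work:

<?php
// $db is the PDO connection from the sketch above.
$batchSize = 1000;
$lastId    = 0;

// Walk the table in id order, one batch at a time, so only
// $batchSize rows are ever buffered in PHP.
$stmt = $db->prepare(
    'SELECT id, payload FROM big_table
     WHERE id > :last ORDER BY id LIMIT ' . $batchSize
);

do {
    $stmt->execute(array(':last' => $lastId));
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
    foreach ($rows as $row) {
        process($row);        // hypothetical per-item work
        $lastId = $row['id']; // remember where this batch ended
    }
} while (count($rows) === $batchSize);

With batches this small the default buffered mode is fine, and the
connection is free to run other queries between batches.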
--
Postgresql & php tutorials
http://www.designmagick.com/