Mike Ginsburg wrote:
> Hello,
> I am working on a personnel registry that has upwards of 50,000
> registrants. Currently I am working on an export module that will
> create a CSV from multiple tables. I have managed to keep the script
> (PHP) under the memory limit
Okay... some info is needed here:
1. memory on the DB server
2. memory_limit in php.ini
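For what it's worth, both the limit and the script's actual usage can be checked from inside PHP itself; a minimal sketch using standard functions:

<?php
// Quick check of the limits in play (ini_get, memory_get_usage and
// memory_get_peak_usage are standard PHP functions).
printf("memory_limit:  %s\n", ini_get('memory_limit'));
printf("current usage: %.1f MB\n", memory_get_usage(true) / 1048576);
printf("peak usage:    %.1f MB\n", memory_get_peak_usage(true) / 1048576);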
> when creating and inserting the CSV into the database. The problem
> comes when I try to query for the data and export it. Memory limit is
> a major concern, but the query for one row returns a result set too
> large and PHP fails.
A single row is enough to crash PHP?
> I've thought about storing the data in multiple rows and then querying
> one-by-one and outputting, but was hoping there was a better way.
If you can't raise memory_limit, I think it's the only way.
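If the CSV ends up stored as many small rows, the export can then stream them out one batch at a time, so the full data set never sits in PHP memory at once. A rough sketch, assuming a PostgreSQL backend with the pgsql extension; the table, column, and connection details (export_chunks, chunk, etc.) are made up for illustration:

<?php
// Stream stored CSV chunks to the client one batch at a time.
// Table and column names (export_chunks, id, chunk) are illustrative only.
$db = pg_connect('host=localhost dbname=registry user=export');

header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="registry.csv"');

// A server-side cursor keeps the full result set on the database side;
// each FETCH pulls only a limited number of rows per round trip.
pg_query($db, 'BEGIN');
pg_query($db, 'DECLARE csv_cur CURSOR FOR SELECT chunk FROM export_chunks ORDER BY id');

while (true) {
    $res = pg_query($db, 'FETCH 100 FROM csv_cur');
    if (pg_num_rows($res) === 0) {
        break;                      // no more rows
    }
    while ($row = pg_fetch_assoc($res)) {
        echo $row['chunk'];         // send this chunk straight to the output
    }
    pg_free_result($res);
    flush();                        // flush PHP's output buffer toward the client
}

pg_query($db, 'CLOSE csv_cur');
pg_query($db, 'COMMIT');
pg_close($db);

The same loop also works against the live tables directly (SELECT the columns and fputcsv() each row to php://output), which would avoid storing a pre-built CSV in the database at all.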
Regards,
ACV