
Re: Large Result and Memory Limit


 



Mike Ginsburg wrote:
> André Volpato wrote:
>> Mike Ginsburg wrote:
>>> Hello,
>>> I am working on a personnel registry that has upwards of 50,000 registrants. Currently I am working on an export module that will create a CSV from multiple tables. I have managed to keep the script (PHP) under the memory limit
>> Okay... some info needed here.
>> 1. memory on the DB server
>> 2. memory_limit in php.ini
> The PHP memory_limit is 16M. We're running multiple installations on a single webserver, so memory is a concern.
> The DB server is separate from the webserver.
>>> when creating and inserting the CSV into the database. The problem comes when I try to query for the data and export it. The memory limit is a major concern, but the query for even one row returns a result set too large and PHP fails.
>> A single row is enough to crash PHP?
> Well, the "data" field in the table (text) contains 50K lines. It's over 30M in size for the full export.



>>> I've thought about storing the data in multiple rows and then querying one-by-one and outputting, but was hoping there was a better way.
>> If you can't raise memory_limit, I think it's the only way.
> I was afraid that would be the answer.


Well, you could try to retrieve the data with substr(), say 10K lines per query across 5 queries.
It's kinda ugly, but it should work in this case...
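[Editor's note: the substr() approach could be sketched in PHP roughly as below. The table name, column names, and row id are hypothetical, and the chunk size is illustrative; this is a sketch of the idea, not a tested implementation.]

```php
<?php
// Fetch one large text column in fixed-size pieces so PHP never holds
// the whole ~30M value in memory at once. PostgreSQL's substr() is
// 1-based: substr(data, offset, length).
$conn = pg_connect("host=dbserver dbname=registry");  // hypothetical DSN

$chunk  = 1024 * 1024;  // 1 MB per query -- well under a 16M memory_limit
$offset = 1;

do {
    // "export", "data", and id 42 are placeholder names for this sketch.
    $res = pg_query_params($conn,
        'SELECT substr(data, $1, $2) AS piece FROM export WHERE id = $3',
        array($offset, $chunk, 42));
    $piece = pg_fetch_result($res, 0, 'piece');
    echo $piece;            // stream the piece straight to the client
    $offset += $chunk;
} while (strlen($piece) === $chunk);  // a short piece means we hit the end

pg_close($conn);
```

Each iteration holds at most one chunk in memory, so peak usage is bounded by the chunk size rather than the size of the stored CSV.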

[]'s
ACV



---------------------------(end of broadcast)---------------------------
TIP 2: Don't 'kill -9' the postmaster

