
Re: Large Result and Memory Limit

On 10/4/07, Mike Ginsburg <mginsburg@xxxxxxxxxxxxxxxxxxxxxxx> wrote:

>  This is for the export only.  Since it is an export of ~50,000 registrants,
> it takes some time to process.  We also have load balanced web servers, so
> unless I want to create identical processes on all webservers, or write some
> crazy script to scp it across the board, storing it as a text file is not an
> option.  I realize that my way of doing it is flawed, which is the reason I
> came here for advice.  The CSV contains data from approximately 15 tables,
> several of which are many-to-ones making joins a little tricky.  My thought
> was to do all of the processing in the background, store the results in the
> DB, and allow the requester to download it at their convenience.
>
>  Would it be a good idea to create a temporary table that stores all of the
> export data, broken out by rows and columns, and when download time
> comes, query from there?

Yeah, I tend to think that would be better.  Then you could use a
cursor to retrieve the rows and serve them one at a time, and not
have to worry about overloading your PHP server.
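Something along these lines, for example -- an untested sketch only; the
table name registrant_export, the connection string, and the 1000-row
batch size are placeholders for whatever you actually have:

<?php
// Stream a pre-built export table to the browser with a server-side
// cursor, so PHP only ever holds one batch of rows in memory instead
// of the full ~50,000-row result set.
$conn = pg_connect('host=localhost dbname=registrants user=web');
if (!$conn) {
    die('connection failed');
}

header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="registrants.csv"');
$out = fopen('php://output', 'w');

// Cursors only live inside a transaction.
pg_query($conn, 'BEGIN');
pg_query($conn, 'DECLARE export_cur NO SCROLL CURSOR FOR
                 SELECT * FROM registrant_export ORDER BY id');

// Fetch and emit a modest batch at a time; memory use is bounded by
// the batch size, not by the total number of registrants.
do {
    $res = pg_query($conn, 'FETCH 1000 FROM export_cur');
    $fetched = pg_num_rows($res);
    while ($row = pg_fetch_row($res)) {
        fputcsv($out, $row);
    }
    pg_free_result($res);
} while ($fetched > 0);

pg_query($conn, 'CLOSE export_cur');
pg_query($conn, 'COMMIT');
fclose($out);
?>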

