Re: Out of memory running 560 MB query


Sorry, I was running the query on what appeared to be a cached copy of the old table—or something.

 

After a reboot, everything seems to be working. The admin client wasn’t responding but now all of my data is there.

 

With regard to the list, I wasn’t sure whether there was a postgresql.conf setting that limited the size of queries, or whether there was a built-in query size limit.

 

—Morgan

 

 

From: Rosser Schwarz [mailto:rosser.schwarz@xxxxxxxxx]
Sent: Sunday, September 14, 2014 5:28 PM
To: Morgan Ramsay
Cc: PgSQL ADMIN
Subject: Re: Out of memory running 560 MB query

 

On Sun, Sep 14, 2014 at 5:18 PM, Morgan Ramsay <morgan.ramsay@xxxxxxxxx> wrote:

Any idea how to successfully run a query that large? What’s the size limit of a query?

 

Without seeing your actual query (or a syntactically similar one, if there are private details you can't disclose), it's difficult to say.  In general, though, a couple of thoughts come to mind:

 

Can you run it as a single COPY statement, instead of however many individual INSERTs? Failing that, can you use the multi-row VALUES syntax for your INSERT?
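
A minimal sketch of both approaches (the table, column, and file names here are invented, purely for illustration):

    -- One COPY statement bulk-loads the whole file in a single command
    COPY my_table (id, name, amount)
        FROM '/path/to/data.csv'
        WITH (FORMAT csv, HEADER true);

    -- Failing that, batch many rows into each INSERT with multi-row VALUES
    INSERT INTO my_table (id, name, amount) VALUES
        (1, 'alpha', 10.50),
        (2, 'beta',  20.00),
        (3, 'gamma', 30.25);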

 

As an aside, this type of question is probably more suited for the pgsql-sql list, rather than -admin.

 

rls

 

--
:wq

