Re: Slow query with a lot of data


On 21.08.2008 at 19:08, Merlin Moncure wrote:

On Thu, Aug 21, 2008 at 11:07 AM, Moritz Onken <onken@xxxxxxxxxxxxxxxx> wrote:

On 21.08.2008 at 16:39, Scott Carey wrote:

It looks to me like the work_mem did have an effect.

Your earlier queries had a sort followed by a group aggregate at the top, and now it's a hash aggregate. So the query plan DID change. That is likely
where the first 10x performance gain came from.

But it didn't change when I added the sub-select.
Thank you guys very much. The speed is now ok and I hope I can finish this
work soon.
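[For reference, the plan change described above can be checked directly. This is only a sketch; the table and column names are hypothetical, not the ones from the original query:]

```sql
-- With a small work_mem, the planner tends to pick Sort + GroupAggregate,
-- spilling the sort to disk; with enough work_mem it can switch to an
-- in-memory HashAggregate instead.
SET work_mem = '8MB';
EXPLAIN SELECT user_id, count(*) FROM clicks GROUP BY user_id;

SET work_mem = '256MB';
EXPLAIN SELECT user_id, count(*) FROM clicks GROUP BY user_id;
-- Compare the two plans: the node at the top should change.
```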

But there is another problem. If I run this query without the restriction on the user id, Postgres consumes about 150GB of disk space and dies with

ERROR: could not write block 25305351 of temporary file: No space left on
device

After that, the available disk space is back to normal.

Is this normal? The resulting table (setup1) is not bigger than 1.5 GB.

Maybe the result is too big. If you EXPLAIN the query, you should get
an estimate of the rows returned. If this is the case, you need to
rethink your query or use something like a cursor to browse the result.

merlin
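[The cursor approach Merlin suggests could look roughly like this. A sketch only; the cursor name, query, and batch size are hypothetical:]

```sql
-- A cursor lets the client consume the result in batches instead of
-- materializing the whole result set at once. Cursors must live inside
-- a transaction (unless declared WITH HOLD).
BEGIN;
DECLARE res_cur CURSOR FOR
    SELECT user_id, count(*) FROM clicks GROUP BY user_id;
FETCH 1000 FROM res_cur;   -- repeat until no rows are returned
CLOSE res_cur;
COMMIT;
```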

There will be a few million rows. But I don't understand why these rows
bloat up so much. When the query is done, the new table is about 1 GB in size,
but while the query is running it uses >150GB of disk space.

moritz


