Re: Out of memory error on select


 



On Tue, 2005-04-05 at 16:04, Werner Bohl wrote:
> I have a fairly large table (21M records). One field of type varchar(16)
> has some duplicate values, which I'm trying to identify.
> Executing select dup_field from dup_table group by dup_field having
> count(*) > 1 errs with an Out of Memory error. The server has 4GB of memory;
> the backend process errs after 3.7GB is consumed. Is there any work-around
> that I can use to find these duplicates?
> 
> Explain output:
> "HashAggregate  (cost=881509.02..881510.02 rows=200 width=20)"
> "  Filter: (count(*) > 1)"
> "  ->  Seq Scan on lssi_base  (cost=0.00..872950.68 rows=1711668
> width=20)"
> 
> Why is the hash eating so much memory? A fast calc of the memory
> occupied by this data is less than 512MB.

Have you run ANALYZE on this table?  It looks like either you
haven't, or the query planner is misestimating how many rows it
expects to get back from this query.
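A quick way to check, assuming the table name lssi_base from the EXPLAIN
output above, is to refresh the statistics and look at the plan again:

```sql
-- Refresh the planner's statistics for this table
ANALYZE lssi_base;

-- Re-check the plan: with accurate statistics, the estimated row
-- count should be far above the 200 groups shown earlier, and the
-- planner should prefer a sort-based GroupAggregate
EXPLAIN SELECT dup_field
FROM lssi_base
GROUP BY dup_field
HAVING count(*) > 1;
```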

HashAggregate chews through memory pretty fast and is best used for
smaller sets, so it's usually a mistake when the planner picks it for
large ones.
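If the planner still insists on HashAggregate after ANALYZE, one
work-around (a sketch using the standard enable_hashagg planner setting)
is to disable hash aggregation for your session, which forces a
sort-based plan that spills to disk instead of exhausting memory:

```sql
-- Force a sort-based GroupAggregate for this session only
SET enable_hashagg = off;

SELECT dup_field
FROM lssi_base
GROUP BY dup_field
HAVING count(*) > 1;

-- Restore the default afterwards
RESET enable_hashagg;
```

The sorted plan will be slower on a 21M-row table, but its memory use
is bounded, so it should complete rather than run the backend out of
memory.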

