Re: Join runs for > 10 hours and then fills up >1.3TB of disk space

I'm expecting 9,961,914 rows returned. Each row in the big table should have a corresponding key in the smaller table; I basically want to "expand" the big table's column list by one, adding the appropriate key from the smaller table for each row in the big table. It's not a Cartesian product join.
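To be concrete, the query is meant to look roughly like the sketch below (the column names segment_key and segment_id are made up here for illustration; the real join column may differ). If the key is unique in xsegment_dim, this should come back with exactly 9,961,914 rows:

  -- Hypothetical column names: one dimension key appended to each
  -- fact row, so the result should have one row per fact row.
  SELECT f.*,
         d.segment_key
  FROM   bigtab_stats_fact_tmp14 f
  JOIN   xsegment_dim d
         ON d.segment_id = f.segment_id;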



On May 16, 2008, at 1:40 AM, Richard Huxton wrote:

kevin kempter wrote:
Hi List;
I have a table with 9,961,914 rows in it (see the describe of bigtab_stats_fact_tmp14 below). I also have a table with 7,785 rows in it (see the describe of xsegment_dim below). I'm running the join shown below; it takes > 10 hours and eventually runs out of disk space on a 1.4TB file system.

QUERY PLAN
------------------------------------------------------------------------
 Merge Join  (cost=1757001.74..73569676.49 rows=3191677219 width=118)

Dumb question, Kevin, but are you really expecting 3.2 billion rows in the result set? Because that's approaching 400GB of result set without any overheads.
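One quick check (segment_id below is just a stand-in for whatever the real join column is) would be to see whether the key repeats in the small table, since any duplicated key there multiplies the matching rows from the big table and would explain an estimate that size:

  -- Any key appearing more than once in the dimension table
  -- will fan out the matching rows from the big table.
  SELECT segment_id, count(*) AS dup_count
  FROM   xsegment_dim
  GROUP  BY segment_id
  HAVING count(*) > 1;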

--
 Richard Huxton
 Archonet Ltd


