Thanks for your response. I ran this query:

    SELECT query_string, DAY.ocu
    FROM search_data.query_string,
         (SELECT SUM(occurence) AS ocu, query
          FROM daily.queries_detail_statistics
          WHERE date >= '2006-01-01' AND date <= '2006-01-30'
            AND portal IN (1, 2)
          GROUP BY query
          ORDER BY ocu DESC
          LIMIT 1000) AS DAY
    WHERE DAY.query = id;

and after a few minutes I get an "out of memory" error, with this execution plan:

 Nested Loop  (cost=8415928.63..8418967.13 rows=1001 width=34)
   ->  Subquery Scan "day"  (cost=8415928.63..8415941.13 rows=1000 width=16)
         ->  Limit  (cost=8415928.63..8415931.13 rows=1000 width=12)
               ->  Sort  (cost=8415928.63..8415932.58 rows=1582 width=12)
                     Sort Key: sum(occurence)
                     ->  HashAggregate  (cost=8415840.61..8415844.56 rows=1582 width=12)
                           ->  Seq Scan on queries_detail_statistics  (cost=0.00..8414056.00 rows=356922 width=12)
                                 Filter: ((date >= '2006-01-01'::date) AND (date <= '2006-01-30'::date) AND (((portal)::text = '1'::text) OR ((portal)::text = '2'::text)))
   ->  Index Scan using query_string_pkey on query_string  (cost=0.00..3.01 rows=1 width=34)
         Index Cond: ("outer".query = query_string.id)
(10 rows)

If the HashAgg operation ran out of memory, what can I do?

Thanks a lot,
martial

> martial.bizel@xxxxxxx writes:
> > I've error "out of memory" with these traces :
>
> Doing what?
>
> > AggContext: -1976573952 total in 287 blocks; 25024 free (414 chunks);
> > -1976598976 used
> > DynaHashTable: 503439384 total in 70 blocks; 6804760 free (257 chunks);
> > 496634624 used
>
> I'd guess that a HashAgg operation ran out of memory ...
>
> 			regards, tom lane
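
For reference, the usual session-level workarounds when a HashAggregate outgrows memory look like this. This is only a sketch, assuming a PostgreSQL 8.x server as suggested by the plan output; the work_mem value is illustrative, not a recommendation:

    -- Option 1: give the aggregation's hash table more room.
    -- On 8.0/8.1, work_mem is an integer in kilobytes (262144 KB = 256 MB);
    -- unit suffixes like '256MB' only appeared in later releases.
    SET work_mem = 262144;

    -- Option 2: forbid hash aggregation for this session, forcing a
    -- sort-based GroupAggregate whose underlying sort can spill to disk
    -- and so stays bounded by work_mem.
    SET enable_hashagg = off;

After either change, re-running EXPLAIN on the same query shows whether the planner switched to a GroupAggregate or kept the HashAggregate with more memory available.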
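
It may also be worth checking the planner's group-count estimate: the HashAggregate node expects only 1582 groups (rows=1582), and the hash table is sized for that. If the true number of distinct `query` values in the date range is much larger, the table keeps growing past what was planned. A quick sketch of how to check, reusing the table and column names from the query above:

    -- How many groups does the aggregate really have to hold?
    SELECT count(DISTINCT query)
    FROM daily.queries_detail_statistics
    WHERE date >= '2006-01-01' AND date <= '2006-01-30'
      AND portal IN (1, 2);

    -- Refresh the statistics the estimate is derived from.
    ANALYZE daily.queries_detail_statistics;

If the real distinct count is far above 1582, re-analyzing the table gives the planner a better basis for choosing between HashAggregate and GroupAggregate.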