Mridula Mahadevan <mmahadevan@xxxxxxxxxxxx> writes:
> I am running a bunch of queries within a function, creating some temp
> tables and populating them. When the data exceeds, say, 100k rows the
> queries start getting really slow and time out (30 min). When these are
> run outside of a transaction (in autocommit mode), they run in a few
> seconds. Any ideas on what may be going on, and any postgresql.conf
> parameters etc. that might help?

I'll bet the function is caching query plans that stop being appropriate
once the table grows in size.  You might have to resort to using EXECUTE,
although if you're on 8.4 DISCARD PLANS ought to help too.

			regards, tom lane
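
(For illustration, here is a minimal sketch of the EXECUTE workaround Tom
mentions, assuming a hypothetical function that fills a temp table and then
queries it -- the function, table, and column names are made up, not taken
from the original report. Running the follow-up query through EXECUTE makes
PL/pgSQL plan it fresh on each call instead of reusing a plan cached when
the temp table was still small.)

    -- Hypothetical example: the names below are illustrative only.
    CREATE OR REPLACE FUNCTION summarize_orders() RETURNS bigint AS $$
    DECLARE
        result bigint;
    BEGIN
        CREATE TEMP TABLE tmp_orders ON COMMIT DROP AS
            SELECT * FROM orders
            WHERE created_at > now() - interval '1 day';

        -- Refresh statistics so the planner sees the temp table's real size.
        ANALYZE tmp_orders;

        -- EXECUTE builds a new plan every time, reflecting the current
        -- contents of tmp_orders rather than a plan cached on the first call.
        EXECUTE 'SELECT count(*) FROM tmp_orders WHERE amount > 100'
            INTO result;

        RETURN result;
    END;
    $$ LANGUAGE plpgsql;

(The DISCARD PLANS alternative Tom notes for 8.4 avoids rewriting the
function at all; the sketch above is just one way to apply the EXECUTE
suggestion.)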