Hi all,

I am trying to find a way to add multithreading to my PostgreSQL stored functions. The situation is this: I have data stored in multiple tables, one table per day, and I want to write a function that selects data from these tables and stores it into files (or simply returns the data in a cursor). Information about which data is stored in which table is kept in another table, defined like this:

    partitions_daily(from_date timestamp, to_date timestamp, table_name)

The from_date and to_date columns tell in which table the requested data can be found.

At the moment I have a function through which I can get the data (a simplified sketch is in the P.S. below), but the problem is that everything is processed on only one CPU core. I think it should be fairly easy to make this model multithreaded and use more CPU cores. I tried to implement this with PL/PerlU, but there is a problem with using SPI from multiple threads. It would also be possible to do the multithreading with some external scripts that fetch the data, but that is not exactly what I want.

Can someone please help me with this? How can I get PostgreSQL to use multiple CPU cores to run my function? Any advice or a simple piece of code would be great.

Thank you for all replies,
Lukas Houf
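
P.S. To make the setup a bit more concrete, here is a simplified sketch of roughly what my current (single-core) function does. This is not my real code: the function name and the data columns (event_time, payload) are just illustrative; only partitions_daily is as described above.

    -- Simplified sketch of the current single-core approach (illustrative names).
    -- It looks up partitions_daily to find the daily tables that cover the
    -- requested interval, then reads each one in turn with dynamic SQL.
    CREATE OR REPLACE FUNCTION get_data(p_from timestamp, p_to timestamp)
    RETURNS TABLE(event_time timestamp, payload text) AS $$
    DECLARE
        part record;
    BEGIN
        FOR part IN
            SELECT table_name
            FROM partitions_daily
            WHERE from_date < p_to AND to_date > p_from
            ORDER BY from_date
        LOOP
            -- One daily table after another, all inside this single backend
            -- process, which is why only one CPU core is ever busy.
            RETURN QUERY EXECUTE
                'SELECT event_time, payload FROM ' || quote_ident(part.table_name) ||
                ' WHERE event_time >= $1 AND event_time < $2'
            USING p_from, p_to;
        END LOOP;
    END;
    $$ LANGUAGE plpgsql;

Something like SELECT * FROM get_data('2012-03-01', '2012-03-05') then drives either the cursor or the export to a file; either way the whole loop runs in one backend process.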
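
P.P.S. To show the general shape of what I am hoping is possible, here is a very rough, untested sketch that uses the dblink contrib module purely as an illustration: one extra connection per daily table, each query running in its own backend process (and therefore potentially on its own core), with the results collected at the end. The connection string, column names, and function name are again just placeholders.

    -- Rough, untested sketch of the kind of parallelism I am after, using dblink
    -- only as an illustration: one connection per daily table, queries sent
    -- asynchronously, results collected afterwards.
    CREATE OR REPLACE FUNCTION get_data_parallel(p_from timestamp, p_to timestamp)
    RETURNS TABLE(event_time timestamp, payload text) AS $$
    DECLARE
        part record;
        n    int := 0;
    BEGIN
        -- Fire one asynchronous query per daily table; each dblink connection
        -- is served by its own backend process, so the queries can keep
        -- several CPU cores busy at the same time.
        FOR part IN
            SELECT table_name
            FROM partitions_daily
            WHERE from_date < p_to AND to_date > p_from
        LOOP
            n := n + 1;
            PERFORM dblink_connect('conn' || n, 'dbname=mydb');  -- placeholder connstr
            PERFORM dblink_send_query('conn' || n,
                'SELECT event_time, payload FROM ' || quote_ident(part.table_name) ||
                ' WHERE event_time >= ' || quote_literal(p_from::text) ||
                ' AND event_time < '    || quote_literal(p_to::text));
        END LOOP;

        -- Collect the results; dblink_get_result() blocks until each query is done.
        FOR i IN 1..n LOOP
            RETURN QUERY
                SELECT * FROM dblink_get_result('conn' || i)
                    AS t(event_time timestamp, payload text);
            PERFORM dblink_disconnect('conn' || i);
        END LOOP;
    END;
    $$ LANGUAGE plpgsql;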