
explicit cursor vs. for loop in pl/pgsql

I need to process a large table a few "chunks" at a time, committing between chunks so that another process can pick up and start processing the data.
 
I am using a pl/pgsql procedure with a "FOR rec in Select * from tab order by...." statement. The chunksize is passed into the procedure, and in the FOR loop I iterate until I reach chunksize. The procedure then returns, and the calling code issues the commit, etc.
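 
Roughly the pattern I mean, as a minimal sketch (the function, table, and column names here are hypothetical, and for simplicity it counts one chunk unit per row, which my real data doesn't quite do):
 
-- Hypothetical sketch of the FOR-loop approach described above.
CREATE OR REPLACE FUNCTION process_chunk(chunksize integer)
RETURNS integer AS $$
DECLARE
    rec       tab%ROWTYPE;
    processed integer := 0;
BEGIN
    FOR rec IN SELECT * FROM tab ORDER BY id LOOP
        -- ... process rec here ...
        DELETE FROM tab WHERE id = rec.id;
        processed := processed + 1;
        EXIT WHEN processed >= chunksize;
    END LOOP;
    RETURN processed;   -- the caller commits after each call
END;
$$ LANGUAGE plpgsql;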
 
I know from the documentation that FOR implicitly opens a cursor, but I'm wondering whether there would be any performance advantage to explicitly declaring a cursor and stepping through it with FETCH commands.
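 
For comparison, the explicit-cursor version I have in mind would look something like this sketch (same hypothetical names; it does the same work, since the FOR loop above already uses an unnamed cursor internally):
 
-- Hypothetical sketch of the explicit cursor / FETCH alternative.
CREATE OR REPLACE FUNCTION process_chunk_cursor(chunksize integer)
RETURNS integer AS $$
DECLARE
    cur       CURSOR FOR SELECT * FROM tab ORDER BY id;
    rec       tab%ROWTYPE;
    processed integer := 0;
BEGIN
    OPEN cur;
    WHILE processed < chunksize LOOP
        FETCH cur INTO rec;
        EXIT WHEN NOT FOUND;    -- table exhausted before the chunk filled
        -- ... process rec here ...
        DELETE FROM tab WHERE id = rec.id;
        processed := processed + 1;
    END LOOP;
    CLOSE cur;
    RETURN processed;
END;
$$ LANGUAGE plpgsql;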
 
I have to use the ORDER BY, so I imagine I'm paying the cost of sorting all the records in the table anyway, regardless of how many I ultimately fetch. The nature of the data is that chunksize doesn't necessarily match up one-for-one with rows, so I can't use it as a LIMIT value.
 
The table in question gets inserted into pretty heavily, and my procedure processes rows and then deletes the ones it has processed. My main concern is to keep the processing fairly smooth, i.e., not have it choke on a SELECT when the table gets huge.
 
Any suggestions appreciated!

- DAP
----------------------------------------------------------------------------------
David Parker    Tazz Networks    (401) 709-5130
 

 
