On Wed, Jul 22, 2009 at 08:24:22PM +0200, Denis BUCHER wrote:
> SELECT ... FROM ODBC source
> foreach row {
>   INSERT INTO postgresql
> }
>
> The problem is that this method is very slow...
>
> Does someone has a better suggestion ?

Using COPY [1] is normally the preferred solution for getting data into
PG fast.  Some languages make this easier than others; if you can
generate SQL that looks like:

  COPY table (col1,col2) FROM STDIN WITH CSV;
  13,hello
  42,"text with,comma"
  \.

then you should be in luck---just bung this off to the ODBC driver as is
and all should be good.  If you need to copy more rows than will fit in
a single string, arrange to put a few thousand rows in each batch, then
generate and send the batches one at a time inside a transaction.

Using tab-delimited mode (drop the WITH CSV) is possible, but most
languages provide library code for generating CSV, which will probably
be easier to get correct.

-- 
  Sam  http://samason.me.uk/

[1] http://www.postgresql.org/docs/current/static/sql-copy.html

-- 
Sent via pgsql-general mailing list (pgsql-general@xxxxxxxxxxxxxx)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general
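As a rough illustration of the batching idea above, here is a minimal
Python sketch that turns an iterable of rows into COPY statements of
the shape shown, a few thousand rows per batch.  The function and
parameter names (`copy_batches`, `batch_size`) are hypothetical, and it
assumes rows arrive as plain tuples (e.g. from an ODBC cursor); it only
builds the SQL text, leaving the sending of each statement to whatever
driver you use.

```python
import csv
import io

def copy_batches(table, columns, rows, batch_size=5000):
    """Yield COPY statements covering `rows`, at most `batch_size`
    rows each.  Hypothetical helper, not part of any driver API."""
    header = "COPY %s (%s) FROM STDIN WITH CSV;\n" % (table, ",".join(columns))
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) >= batch_size:
            yield _render(header, batch)
            batch = []
    if batch:
        yield _render(header, batch)

def _render(header, batch):
    # Let the csv module handle quoting (commas, embedded quotes),
    # which is the "library code for generating CSV" mentioned above.
    buf = io.StringIO()
    buf.write(header)
    writer = csv.writer(buf, lineterminator="\n")
    writer.writerows(batch)
    buf.write("\\.\n")          # end-of-data marker for COPY FROM STDIN
    return buf.getvalue()

# Example: two rows, batch_size=1 forces two statements.
rows = [(13, "hello"), (42, "text with,comma")]
for stmt in copy_batches("table", ["col1", "col2"], rows, batch_size=1):
    print(stmt)
```

Wrapping the loop that sends these statements in a single transaction
keeps the load atomic, as suggested above.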