Terabytes of data: that is a lot of Oracle data to migrate. You would need high-performance tools capable of handling a heterogeneous environment. People have suggested links here, so I will add some that could be very appropriate to your case.

The PostgreSQL loader is limited, by the way. For instance, if your data contains an embedded end-of-line character, the load into PostgreSQL will fail. Check this PDF: http://www.wisdomforce.com/dweb/resources/docs/OracleToNetezzaWithFastReader.pdf

A few tools to consider:

FastReader: http://www.wisdomforce.com/dweb/index.php?id=23 - extracts data from Oracle into ASCII flat files or a pipe and creates input for the PostgreSQL loader. Many people use it for fast initial synchronization. FastReader performs bulk data extraction, so terabytes of data can be migrated in hours.

Database Sync: http://www.wisdomforce.com/dweb/index.php?id=1001 - also a fast data transfer tool, operating as change data capture. It captures all the latest transactions and can be used for incremental data warehouse feeds from OLTP Oracle data. You may want it if you don't want to move terabytes of data every time, but only the changed data.

On Jan 11, 10:02 am, joshq...@xxxxxxxxx ("Josh Harrison") wrote:
> Hi
> We have an Oracle production database with some terabytes of data. We wanted
> to migrate that to a Postgresql (right now...a test database and not
> production) database.
> What are the good options to do that?
> Please advise me on where to look for more information on this topic
> thanks
> josh

---------------------------(end of broadcast)---------------------------
TIP 5: don't forget to increase your free space map settings
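The embedded-newline problem mentioned above can be avoided by writing the extracted flat file as quoted CSV, which PostgreSQL's COPY accepts in CSV mode. A minimal sketch in Python (the sample rows and the target table name are hypothetical, not from the tools discussed here):

```python
import csv
import io

# Hypothetical rows extracted from Oracle; the second field of the
# first row contains an embedded newline, which would break a plain
# text-format load into PostgreSQL.
rows = [
    (1, "line one\nline two"),
    (2, "no newline here"),
]

# Write the rows as CSV: fields containing newlines (or commas/quotes)
# are quoted, so the line structure of the file stays unambiguous.
buf = io.StringIO()
writer = csv.writer(buf, quoting=csv.QUOTE_MINIMAL)
writer.writerows(rows)
payload = buf.getvalue()

# A file produced this way can then be loaded with, for example:
#   COPY target_table FROM '/path/to/data.csv' WITH (FORMAT csv);
print(payload)
```

The same idea applies to any extraction tool: as long as the flat file quotes embedded newlines in the CSV style, the PostgreSQL loader can consume it in CSV mode.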