Re: Exporting TBs of data in Oracle schema/database into CSV format to migrate into PostgreSQL

Hi,
On Mon, Apr 20, 2020 at 12:31 PM Scott Ribe <scott_ribe@xxxxxxxxxxxxxxxx> wrote:
So, the big question is: how much downtime can you afford on the current Oracle DB?

Yes, you should investigate the most efficient way to transfer data, but with TBs it will still take significant time. If you can afford the downtime, great--kill all other connections to Oracle and go for it. If not, it's more complicated, since you'll have to figure out a way to track and apply the changes that happen during the copy...
If downtime is a concern, then: start a transaction on PG at isolation level REPEATABLE READ, get the current SCN from Oracle, and copy the existing data table by table over FDW for the initial data load. Then, if you have an Oracle GG license, start the Replicat from that SCN so the data eventually gets back in sync. A rough sketch of the initial-load step follows.
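Roughly what that initial load could look like, assuming oracle_fdw on the PostgreSQL side; the server, schema, and table names below are only placeholders, and the target tables are assumed to already exist:

    -- On Oracle, note the current SCN before the load starts:
    --   SELECT current_scn FROM v$database;

    -- On PostgreSQL, map the Oracle tables through oracle_fdw:
    CREATE EXTENSION oracle_fdw;
    CREATE SERVER orcl_srv FOREIGN DATA WRAPPER oracle_fdw
        OPTIONS (dbserver '//oracle-host:1521/ORCLPDB');
    CREATE USER MAPPING FOR postgres SERVER orcl_srv
        OPTIONS (user 'app_reader', password 'secret');
    CREATE SCHEMA ora_stage;
    IMPORT FOREIGN SCHEMA "APP" FROM SERVER orcl_srv INTO ora_stage;

    -- Initial load, table by table, inside one snapshot on the PG side:
    BEGIN ISOLATION LEVEL REPEATABLE READ;
    INSERT INTO app.emp SELECT * FROM ora_stage.emp;
    -- ...repeat for the remaining tables...
    COMMIT;

If you do have GoldenGate, the Replicat on the target side could then be started from that SCN, along the lines of (group name and CSN value are placeholders):

    GGSCI> START REPLICAT rpg, AFTERCSN 1234567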
Once all the tables are in sync, go ahead and switch your apps. If you don't have a GG license, you need to look at other extensions, Kafka, or something else that can do this job. Remember, migration is always challenging when lots of data is involved, and there is never a direct magic bullet.


--
Regards,
Avinash Vallarapu
+1-902-221-5976
