Massive Insert Strategies

I have a database with foreign keys enabled on the schema. I receive different 
files every night, some of them huge, and I need to load them into the 
database. There are several scenarios that I want to design an optimal 
solution for:

1. One of the files has around 80K records, and I have to delete everything 
from the table and load this file. The provider never supplies a "delta file", 
so I have no way to identify which records are already present and which are 
new. If I don't delete everything and insert fresh, I have to run around 80K 
SELECTs to decide whether each record exists or not. There are also a lot of 
tables that have foreign keys linked to this table, so unless I disable the 
foreign keys, I cannot really delete anything from it. What would be a good 
practice here? (I have sketched the approach I am considering after the second 
scenario below.)

2. Another file that I receive has around 150K records that I need to load 
into the database. One of the fields is logically a "foreign key" to another 
table, but it is linked to the parent table via a database-generated serial ID 
instead of the actual value, while the file carries the actual value. So once 
again, I either have to drop the foreign key, or run 150K SELECTs to look up 
the serial ID so that the foreign key is satisfied. What would be a good 
strategy in this scenario? (A second sketch for this case also follows below.)
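
For #1, here is the kind of approach I have been sketching: bulk-load the 
file into a staging table with COPY, then reload the real table inside one 
transaction with the foreign keys deferred, so the child tables never see a 
gap. All table, column, and constraint names below are made up, and COPY from 
a server-side file needs the right privileges (psql's \copy is the 
client-side equivalent):

    -- One-time change: make the child constraints deferrable, so FK
    -- checks can wait until COMMIT instead of firing per statement.
    ALTER TABLE child
        DROP CONSTRAINT child_master_id_fkey,
        ADD CONSTRAINT child_master_id_fkey
            FOREIGN KEY (master_id) REFERENCES master (id)
            DEFERRABLE INITIALLY IMMEDIATE;

    BEGIN;
    SET CONSTRAINTS ALL DEFERRED;      -- defer all FK checks to COMMIT

    CREATE TEMP TABLE master_stage (LIKE master) ON COMMIT DROP;
    COPY master_stage FROM '/path/to/nightly_file.csv' WITH (FORMAT csv);

    DELETE FROM master;                -- allowed now; checks run at COMMIT
    INSERT INTO master SELECT * FROM master_stage;

    COMMIT;                            -- child FKs verified here in one pass

As I understand it, the COMMIT would still fail if some child row references 
a key that disappeared from the new file, which I suppose is exactly what the 
constraint should catch. Alternatively, skipping the delete entirely, one 
anti-join against the staging table finds the new rows without any per-record 
SELECTs (assuming the table has some natural key to match on):

    INSERT INTO master
    SELECT s.*
    FROM master_stage s
    LEFT JOIN master m ON m.natural_key = s.natural_key
    WHERE m.natural_key IS NULL;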
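
For #2, the same staging idea seems to avoid both dropping the foreign key 
and the 150K lookups: COPY the file into a temp table, then resolve every 
actual value to its serial ID with one set-based join. Again, all names here 
("parent", "detail", "code") are invented for the sketch:

    BEGIN;

    CREATE TEMP TABLE detail_stage (
        code    text,       -- the actual value carried in the file
        payload text        -- stand-in for the record's remaining columns
    ) ON COMMIT DROP;

    COPY detail_stage FROM '/path/to/detail_file.csv' WITH (FORMAT csv);

    -- One join turns each actual value into the parent's serial ID,
    -- so the FK is satisfied without dropping it:
    INSERT INTO detail (parent_id, payload)
    SELECT p.id, s.payload
    FROM detail_stage s
    JOIN parent p ON p.code = s.code;

    COMMIT;

One thing I noticed: an inner join would silently skip file rows whose value 
has no parent, so I might run a LEFT JOIN first to detect those. Does this 
sound like the right direction?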

Please pardon my inexperience with databases!

Thanks,
Amit

