
Insert into on conflict, data size up to 3 billion records

Hi, 

I'm looking for suggestions on how I can improve the performance of the merge statement below. We have a batch process that loads data into the _batch tables in Postgres, and the task is to update the main target table when a record already exists and insert it otherwise. Sometimes these batch tables can grow to 5 billion records. Here is the current scenario.
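(The statement itself is not reproduced in the archive; a minimal sketch of the kind of INSERT ... ON CONFLICT upsert being described, where the table names, the id conflict key, and the payload column are all hypothetical, might look like this:)

    -- Illustration only: table and column names are assumptions,
    -- not taken from the original post.
    INSERT INTO target_table_main (id, payload, logical_ts)
    SELECT b.id, b.payload, b.logical_ts
    FROM   target_table_batch b
    ON CONFLICT (id) DO UPDATE
       SET payload    = EXCLUDED.payload,
           logical_ts = EXCLUDED.logical_ts;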

target_table_main has 700,070,247 records and is hash partitioned into 50 chunks, with an index on logical_ts. The batch table has 2,715,020,546 records, close to 3 billion, so I'm dealing with a huge set of data and looking for the most efficient way to do this.
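(For context, a hash-partitioned layout along the lines described above might be defined as follows; the column names and types here are assumptions, not taken from the actual schema:)

    -- Hypothetical DDL matching the description: 50 hash partitions
    -- plus an index on logical_ts. Requires PostgreSQL 11 or later.
    CREATE TABLE target_table_main (
        id         bigint      NOT NULL,
        payload    text,
        logical_ts timestamptz,
        PRIMARY KEY (id)
    ) PARTITION BY HASH (id);

    -- one of the 50 partitions (remainders 0 through 49)
    CREATE TABLE target_table_main_p0
        PARTITION OF target_table_main
        FOR VALUES WITH (MODULUS 50, REMAINDER 0);

    CREATE INDEX ON target_table_main (logical_ts);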

Thank you
