Re: Batch update million records in prd DB

Hi Michael,

Thank you for your reply.

We found that the time per loop varies and keeps getting slower. Our table is large and the query joins another table; even with an index, the last batch of 1000 records takes around 15 seconds. Will that be a problem? Will other concurrent updates have to wait those 15 seconds until the lock is released?
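
To make the setup concrete, here is a rough sketch of the kind of loop we run, using psycopg2 (table and column names are placeholders: big_table, id, processed; the real query also joins another table):

    import psycopg2

    conn = psycopg2.connect("dbname=prod")  # placeholder connection string
    cur = conn.cursor()

    while True:
        # Find the next batch of ids still needing the update;
        # this lookup is what seems to get slower as the run progresses.
        cur.execute("""
            SELECT id
            FROM big_table
            WHERE processed = false
            ORDER BY id
            LIMIT 1000
        """)
        ids = [row[0] for row in cur.fetchall()]
        if not ids:
            break

        # Update only this batch; the rows stay row-locked until the commit,
        # which is the ~15-second window other writers might have to wait on.
        cur.execute(
            "UPDATE big_table SET processed = true WHERE id = ANY(%s)",
            (ids,),
        )
        conn.commit()

    cur.close()
    conn.close()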

Thanks and best regards 

Michael Lewis <mlewis@xxxxxxxxxxx> wrote on Wednesday, February 24, 2021 at 11:47 PM:
Of course it will impact a system using that table, but not significantly, I expect, and the production system should handle it. Since you are committing each batch like this, you can kill the script at any time without losing completed work. The query that finds the next set of IDs to update is probably the slowest part, depending on what indexes you have.
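
One common way to keep that next-batch lookup from slowing down (a sketch only, with made-up names big_table/id/processed, not necessarily a drop-in for your script) is keyset pagination: remember the last id you handled and seek past it via the primary-key index, instead of re-scanning already-finished rows every iteration. For example, with psycopg2:

    import psycopg2

    conn = psycopg2.connect("dbname=prod")  # placeholder connection string
    cur = conn.cursor()

    last_id = 0
    while True:
        # Update the next 1000 rows strictly after the last id handled.
        # The "id > %s" condition is a cheap index range scan, so the
        # cost per batch stays roughly constant over the whole run.
        cur.execute("""
            WITH batch AS (
                SELECT id
                FROM big_table
                WHERE id > %s
                ORDER BY id
                LIMIT 1000
            )
            UPDATE big_table t
            SET processed = true
            FROM batch
            WHERE t.id = batch.id
            RETURNING t.id
        """, (last_id,))
        rows = cur.fetchall()
        if not rows:
            break
        last_id = max(r[0] for r in rows)
        conn.commit()  # release this batch's row locks before the next one

    cur.close()
    conn.close()

This also keeps each transaction short, so a concurrent writer blocked on one of the batch's rows only waits until the current batch commits.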
