Re: Need an idea to operate massive delete operation on big size table.

On Wed, 2025-01-15 at 20:23 +0530, Gambhir Singh wrote:
> I received a request from a client to delete duplicate records from a very large table.
> 
> Delete queries (~2 billion) are provided in a file, and we have to execute that file against the DB.
> The last run took two days. I feel there must be a more efficient way to delete the records.
> 
> They do this kind of activity every month.

I don't think there is a better way, except perhaps to create a new copy of
the table and copy only the surviving rows into it.  That may win if you
delete a majority of the rows.
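
Roughly like this, assuming the duplicates can be identified by a key such
as (col1, col2); all table and column names here are invented, so treat it
as a sketch rather than a recipe:

    BEGIN;
    -- new table with the same columns, defaults and indexes
    CREATE TABLE big_table_new (LIKE big_table INCLUDING ALL);

    -- keep exactly one representative row per duplicate group
    INSERT INTO big_table_new
    SELECT DISTINCT ON (col1, col2) *
    FROM big_table
    ORDER BY col1, col2, id;

    -- swap the tables (takes a short ACCESS EXCLUSIVE lock)
    ALTER TABLE big_table RENAME TO big_table_old;
    ALTER TABLE big_table_new RENAME TO big_table;
    COMMIT;

    DROP TABLE big_table_old;

You would still have to take care of foreign keys, views and sequence
ownership yourself, since LIKE doesn't copy everything.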

For the future, you could consider not inserting the duplicate rows in the
first place rather than deleting them afterwards.  Perhaps a constraint that
prevents the duplicates can help.
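
For example (again with invented names, and assuming the existing duplicates
have already been removed, since the constraint build would fail otherwise):

    -- reject future duplicates at the source
    ALTER TABLE big_table
        ADD CONSTRAINT big_table_no_dupes UNIQUE (col1, col2);

    -- the monthly load can then skip duplicates instead of creating them
    INSERT INTO big_table (col1, col2, payload)
    VALUES (42, '2025-01-15', 'some data')
    ON CONFLICT ON CONSTRAINT big_table_no_dupes DO NOTHING;

That trades a one-time index build and some insert overhead for never having
to run the monthly delete again.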

Yours,
Laurenz Albe





