Re: time taking deletion on large tables

On Dec 3, 2020, at 9:45 AM, Atul Kumar <akumar14871@xxxxxxxxx> wrote:

The feed_posts table has over 50 million rows.

I'm deleting all rows of a certain type that are over 60 days old.

When I try to do a delete like this, it hangs for an entire day, so I
need to kill it with pg_terminate_backend(pid).

Delete the records in batches. I have used this approach successfully many times on large, highly active tables in live production systems.

You'll have to find the right batch size for your dataset, one that keeps each run short (e.g. around 30 seconds). Then call the function repeatedly from a script; I've used a Perl script with the DBI module to accomplish this. A SQL-only alternative is sketched after the function below.

For example:

create or replace function purge_feed_post (_purge_date date, _limit int default 5000)
  returns int
as
$$
  declare
    _rowcnt int;
  begin
    /* Work table holding the keys targeted by this batch; it is a
       session temp table, so it persists and is reused across calls */
    create temp table if not exists purge_feed_post_set (
      feed_post_id int
      )
    ;

    /* Identify records to be purged; this works best with an index
       on feed_posts (created_at) */
    insert into purge_feed_post_set (
      feed_post_id
      )
      select feed_post_id
      from feed_posts
      where created_at < _purge_date
      order by created_at
      limit _limit
    ;

    /* Remove old records */
    delete from feed_posts using purge_feed_post_set
      where feed_posts.feed_post_id = purge_feed_post_set.feed_post_id
    ;

    /* Capture how many rows this batch removed */
    get diagnostics _rowcnt = ROW_COUNT;

    /* Clear the work table for the next call */
    delete from purge_feed_post_set;

    return _rowcnt;
  end;
$$ language plpgsql
  set search_path = public
;
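
Called once, the function removes at most one batch, e.g. select purge_feed_post(current_date - 60);. If you'd rather keep the driver in SQL instead of an external Perl/DBI script, a plain DO block can loop over it on PostgreSQL 11 or later, where COMMIT is allowed inside DO. A minimal sketch, assuming the function above; the cutoff (current_date - 60, matching the 60-day requirement) and the batch size of 5000 are just illustrative:

do $$
declare
  _deleted int;
begin
  loop
    /* One batch per iteration; stop once nothing is left to purge */
    _deleted := purge_feed_post(current_date - 60, 5000);
    exit when _deleted = 0;
    commit;  -- end this batch's transaction so locks don't accumulate
  end loop;
end;
$$;

Run it outside an explicit transaction block, or the COMMIT will fail. Either way, expect autovacuum (or a manual VACUUM) to reclaim the dead space afterwards.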

