Hi Atul,
Please try the code below. Execute all the statements in one transaction.
select * into new_table from old_table where type = 'abcz';
truncate table old_table;
insert into old_table select * from new_table;
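As a rough sketch only (new_table, old_table and the type filter are just placeholders for your actual schema), the same statements inside one explicit transaction would look like this; note that TRUNCATE holds an ACCESS EXCLUSIVE lock on old_table until the transaction commits:

BEGIN;
-- copy only the rows you want to keep into a new table
SELECT * INTO new_table FROM old_table WHERE type = 'abcz';
-- empty the original table (takes an ACCESS EXCLUSIVE lock)
TRUNCATE TABLE old_table;
-- put the kept rows back into the original table
INSERT INTO old_table SELECT * FROM new_table;
COMMIT;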
On Thu, Dec 3, 2020 at 8:16 PM Atul Kumar <akumar14871@xxxxxxxxx> wrote:
Hi,
The feed_posts table has over 50 million rows.
I am deleting all rows of a certain type that are over 60 days old.
When I try to do a delete like this, it hangs for an entire day, so I
need to kill it with pg_terminate_backend(pid).
DELETE FROM feed_posts
WHERE feed_definition_id = 'bf33573d-936e-4e55-8607-72b685d2cbae'
AND created_at > '2020-05-11 00:00:00'
AND created_at < '2020-05-12 00:00:00';
So I need help in figuring out how to do large deletes on a
production database during normal hours.
The explain plan is given below:
"Delete on feed_posts (cost=1156.57..195748.88 rows=15534 width=6)"
" -> Bitmap Heap Scan on feed_posts (cost=1156.57..195748.88
rows=15534 width=6)"
" Recheck Cond: ((created_at >= '2020-05-11 00:00:00'::timestamp
without time zone) AND (created_at <= '2020-05-12 00:00:00'::timestamp
without time zone))"
" Filter: (feed_definition_id = 'bf33573d-936e-4e55-8607-72b685d2cbae'::uuid)"
" -> Bitmap Index Scan on feed_posts_created_at (cost=0.00..1152.68
rows=54812 width=0)"
" Index Cond: ((created_at >= '2020-05-11 00:00:00'::timestamp without
time zone) AND (created_at <= '2020-05-12 00:00:00'::timestamp without
time zone))"
Please help me with deleting these rows. Do I need to do anything in the
Postgres configuration, or in the table structure?
Regards,
Atul
Regards,
Ravikumar S,
Ph: 8106741263