Dear All,
I am trying to change the types of some specific columns on a few tables in prod, and I am looking for the best way to do it with very minimal downtime.
When I try running ALTER TABLE table_name ALTER COLUMN column_name TYPE INT; it hangs and ends up holding a lock on the table.
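While the ALTER sits there, a query like the one below shows which sessions are waiting on locks and what they are running. This is a generic query against the standard pg_stat_activity and pg_locks catalogs; nothing in it is specific to my schema:

```sql
-- Illustrative: list backends that are waiting on an ungranted lock,
-- together with the query each one is running.
SELECT a.pid, a.state, a.wait_event_type, a.query
FROM pg_stat_activity a
JOIN pg_locks l ON l.pid = a.pid
WHERE NOT l.granted;
```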
I have the following plan in mind, but with one doubt:
- Stop Postgres and all application connections
- Take a schema-only dump of the specific tables (this takes a few seconds)
- Take a complete dump of those tables, or just a data-only dump (the -a option); this takes 3 to 5 minutes
- Drop those tables from the prod DB
- Restore the schema dump
- Run the script that changes the columns from BIGINT to INT
- Then restore the data. Since the data will take more than 30 minutes to restore, I want to allow the application to start accessing the DB while the restore is still running.
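To make the plan concrete, here is roughly what I have in mind as commands. All table, column, database, and file names below are placeholders for illustration, not my real schema:

```shell
# Placeholder names throughout; one table shown, repeated per table in practice.
pg_dump -s -t my_table prod_db > schema.sql    # schema-only dump (seconds)
pg_dump -a -t my_table prod_db > data.sql      # data-only dump (3-5 min)
psql prod_db -c 'DROP TABLE my_table;'         # drop the table
psql prod_db -f schema.sql                     # restore the (now empty) table
# the type-change script; the ALTER is fast while the table is empty
psql prod_db -c 'ALTER TABLE my_table ALTER COLUMN my_col TYPE INT;'
psql prod_db -f data.sql                       # restore the data (30+ min)
```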
So my question is: would it be a good idea to bring the DB back up, with applications connecting, while the data for those specific tables is still being restored?
If this is not a good idea, any suggestions on how I can make these changes to the live DB with minimal downtime?
Many thanks
Ndoh