I am looking for help to minimise the time taken by the pg_basebackup utility.
As mentioned earlier, we are taking a backup of the database with the pg_basebackup utility using the command below.
$PGHOME/bin/pg_basebackup -p 5433 -U postgres -P -v -x --format=tar --gzip --compress=6 -D /opt/backup_db
According to our previous discussion, pg_basebackup does not depend on any of the PostgreSQL configuration parameters. If we go with the gzip format, we have to compromise on time.
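One idea I am considering for the compression bottleneck: since pg_basebackup's built-in gzip is single-threaded, streaming the tar output to stdout and compressing it externally with a parallel compressor such as pigz might cut the wall-clock time. A sketch of what I mean (the pigz thread count and output path are my assumptions; -D - only works with --format=tar and a single tablespace):

```shell
# Stream the base backup as a tar archive to stdout and compress it
# with pigz using 4 threads instead of the built-in gzip.
# Requires a single tablespace; pigz and the paths are assumptions.
$PGHOME/bin/pg_basebackup -p 5433 -U postgres -P -v -x --format=tar -D - \
  | pigz -p 4 -6 > /opt/backup_db/base.tar.gz
```

Would this be a reasonable alternative, or does it have drawbacks I am missing?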
We are planning to proceed with the steps below. Please correct me if I am wrong.
- Identify the larger indexes (those above 256 MB in size) and drop them. This will reduce the size of the database.
- Take the backup of the database.
- Recreate the indexes in the environment that we restore from the backup.
I am new to PostgreSQL. Could you help me construct the queries to drop and recreate the indexes, please?
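To make the question concrete, here is what I have drafted so far; the 256 MB threshold comes from our plan above, the index and table names are hypothetical placeholders, and I am not sure the catalog query is right:

```sql
-- List user indexes larger than 256 MB, with their definitions,
-- so the CREATE INDEX statements can be saved before dropping.
SELECT schemaname,
       indexrelname,
       pg_size_pretty(pg_relation_size(indexrelid)) AS index_size,
       pg_get_indexdef(indexrelid) AS definition
FROM pg_stat_user_indexes
WHERE pg_relation_size(indexrelid) > 256 * 1024 * 1024;

-- Drop one of the identified indexes (hypothetical name)
DROP INDEX my_schema.my_index;

-- After restoring the backup, recreate it from the saved definition
-- (hypothetical example of what pg_get_indexdef returns)
CREATE INDEX my_index ON my_schema.my_table (my_column);
```

My thinking is that saving the pg_get_indexdef output before dropping means the indexes can be recreated exactly on the restored environment. Is this the right approach?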