Re: Backup

This is great, thank you so much! 

On Thu, Oct 17, 2024 at 12:47 AM Asad Ali <asadalinagri@xxxxxxxxx> wrote:

Hi Andy,

I hope you're doing well. Based on your inquiry about PostgreSQL backups for your 100GB historical database with images, here are some suggestions that should help you achieve compressed, efficient backups without running into storage issues.

1. Use Custom or Directory Format with Compression
A more efficient option is the custom format (-Fc) with compression. If you also want to make use of your machine's multiple CPUs, use the directory format (-Fd), since pg_dump only supports parallel jobs (-j) with the directory format:

pg_dump -Fd -Z 9 -j 4 --blobs -f /path/to/backup/backup_dir your_database_name
  • -Fd: Directory format (compressed by default, supports parallel dumps and flexible restore options; -Fc gives a single compressed file but cannot be combined with -j).
  • -Z 9: Maximum compression level (0-9 scale).
  • -j 4: Number of parallel jobs (adjust based on CPU cores).
  • --blobs: Includes large objects (important for your images).
This approach should give you a smaller backup with faster performance.
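The restore side can also run in parallel with pg_restore. A minimal sketch, reusing the illustrative paths above and assuming the target database already exists:

pg_restore -j 4 -d your_database_name /path/to/backup/backup_dir

Parallel restore works with both the directory and custom formats, so you keep that benefit even if you dump with -Fc.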

2. Splitting Backups into Parts
If you're concerned about running out of storage space, consider splitting the backup by table or schema, allowing more control over the backup size:

pg_dump -Fc --table=my_large_table -f /path/to/backup/my_large_table.dump your_database_name
pg_dump -Fc --table=other_table -f /path/to/backup/other_table.dump your_database_name

This can be helpful when you archive different tables or sections of data. Note that if your images are stored as large objects, add --blobs to these per-table dumps, because pg_dump does not include large objects by default when --table is used.
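If your archive is organized by schema rather than by table, the same idea works with -n. A sketch, using a hypothetical schema named archive_2024:

pg_dump -Fc --blobs -n archive_2024 -f /path/to/backup/archive_2024.dump your_database_name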

3. External Compression Tools
Keep in mind that the custom and directory formats are already compressed, so piping them through gzip gains little. External compression is most useful with the plain format, or when you want a different compressor:

pg_dump -Fp --blobs your_database_name | gzip > /path/to/backup/file.sql.gz

This streams the dump straight into the compressor, so an uncompressed copy never has to fit on disk.
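Since your machine has multiple CPUs, a multithreaded compressor such as zstd is worth trying. A sketch with illustrative paths (-T0 uses all available cores; the restore line assumes the target database already exists):

pg_dump -Fp --blobs your_database_name | zstd -T0 -c > /path/to/backup/file.sql.zst
zstd -dc /path/to/backup/file.sql.zst | psql your_database_name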

4. Consider Alternative Backup Methods
- Explore other backup methods like `pgBackRest` or `WAL-G` (the successor to WAL-E). These tools are specifically designed for PostgreSQL backups and offer features like incremental backups and point-in-time recovery:
pgbackrest --stanza=your-database --type=full --compress-type=zst --compress-level=6 --process-max=4 backup
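
Note that pgBackRest needs a small configuration file and a one-time stanza-create before the first backup. A minimal sketch; the repository path, data directory, and retention setting below are illustrative and should be adjusted to your installation:

# /etc/pgbackrest/pgbackrest.conf (illustrative)
[global]
repo1-path=/var/lib/pgbackrest
repo1-retention-full=2

[your-database]
pg1-path=/var/lib/postgresql/16/main

# one-time initialization, after which the backup command above can run
pgbackrest --stanza=your-database stanza-create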

- You can use pg_basebackup for PostgreSQL backups, but it has limitations compared to tools like pgBackRest. While pg_basebackup is easy to use and built into PostgreSQL, it is designed for physical (base) backups and doesn't offer advanced features such as incremental backups, sophisticated compression, or parallelism.

However, it does support basic compression and can be used for full backups.

pg_basebackup -D /path/to/backup/directory -F t -z -Z 9 -P -X stream
  • -D: The destination directory for the backup.
  • -F t: Specifies the tar format for the backup, which is required for compression.
  • -z: Compresses the output.
  • -Z 9: Compression level (0–9, where 9 is the highest).
  • -P: Shows the progress of the backup.
  • -X stream: Includes the WAL files needed to make the backup consistent (important for recovery).
For example:

pg_basebackup -D /backups/db_backup -F t -z -Z 9 -P -X stream

This command takes a full physical backup of the whole cluster, compresses the output with gzip, and stores it as tar archives (base.tar.gz plus pg_wal.tar.gz for the streamed WAL).
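
To recover from such a backup, you extract the tar archives into an empty data directory before starting the server. A rough sketch, assuming the paths used above and a Debian-style PostgreSQL 16 data directory (adjust to your installation):

# stop the server, then restore into a fresh, empty data directory
mkdir -p /var/lib/postgresql/16/main
tar -xzf /backups/db_backup/base.tar.gz -C /var/lib/postgresql/16/main
tar -xzf /backups/db_backup/pg_wal.tar.gz -C /var/lib/postgresql/16/main/pg_wal
# fix ownership and permissions, then start PostgreSQL again
chown -R postgres:postgres /var/lib/postgresql/16/main
chmod 700 /var/lib/postgresql/16/main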

5. Automating Backups
Since you need monthly backups, I recommend automating this process with a cron job. For example, you can set this up to run on the 1st of every month at 2 AM:

0 2 1 * * /usr/bin/pg_dump -Fc -Z 9 --blobs -f /path/to/backup/file.dump your_database_name
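
The crontab line above overwrites the same file every month. If you would rather keep each month's backup and use the parallel directory format, a small wrapper script called from cron is one option. A sketch only; the paths, retention period, and database name are placeholders:

#!/bin/bash
# monthly_pg_backup.sh - illustrative monthly backup wrapper
set -euo pipefail
BACKUP_ROOT=/path/to/backup
DB=your_database_name
TARGET="$BACKUP_ROOT/${DB}_$(date +%Y%m)"   # e.g. /path/to/backup/your_database_name_202410
pg_dump -Fd -Z 9 -j 4 --blobs -f "$TARGET" "$DB"
# drop backups older than roughly six months so the drive does not fill up
find "$BACKUP_ROOT" -maxdepth 1 -type d -name "${DB}_*" -mtime +180 -exec rm -rf {} +

The crontab entry then becomes: 0 2 1 * * /path/to/monthly_pg_backup.sh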

6. Monitoring Disk Usage & Backup Performance
Finally, it's important to monitor your available storage. Make sure you have enough free space before each run, and consider moving older backups to external or cloud storage to free up space.
Also track how long each backup takes and how large it ends up; that makes it much easier to spot bottlenecks and to tune the compression level or the number of parallel jobs.
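
A quick way to keep an eye on this from the shell (the 120 GB threshold below is just an example based on the free space you mentioned):

# free space on the backup drive and size of existing backups
df -h /path/to/backup
du -sh /path/to/backup/*

# optional pre-backup guard inside the wrapper script
avail_kb=$(df --output=avail -k /path/to/backup | tail -n 1)
if [ "$avail_kb" -lt $((120 * 1024 * 1024)) ]; then
    echo "Less than 120 GB free on the backup drive, skipping backup" >&2
    exit 1
fi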

I hope this helps you create smaller and quicker backups for your PostgreSQL database. Let me know if you have any questions or need further assistance!

Best regards,

Asad Ali


On Thu, Oct 17, 2024 at 12:38 AM Andy Hartman <hartman60home@xxxxxxxxx> wrote:
I am very new to Postgres and have always worked in the MSSQL world. I'm looking for suggestions on DB backups. I currently have a DB used to store historical information that contains images; it's currently around 100 GB.

I'm looking to take a monthly backup, as I archive a month of data at a time. I'd like it to be compressed, and I have a machine with multiple CPUs and ample memory.

Suggestions on things I can try?
I did a pg_dump using these parameters:
--format=t --blobs lobarch

It ran my device out of storage:

pg_dump: error: could not write to output file: No space left on device

I have 150 GB free on my backup drive... I can obviously add more.

I'm looking for the quickest and smallest backup file output...

Thanks again for the help and suggestions.

