On 10/16/24 21:37, Andy Hartman wrote:
> I am very new to Postgres and have always worked in the mssql world. I'm
> looking for suggestions on DB backups. I currently have a DB used to
> store historical information that has images; it's currently around 100 GB.
>
> I'm looking to take a monthly backup, as I archive a month of data at a
> time. I'd like it to be compressed, and I have a machine with multiple
> CPUs and ample memory.
>
> Suggestions on things I can try?
>
> I did a pg_dump using these params:
>
> --format=t --blobs lobarch
>
> It ran my device out of storage:
>
> pg_dump: error: could not write to output file: No space left on device
>
> I have 150 GB free on my backup drive... can obviously add more.
>
> Looking for the quickest and smallest backup file output...
>
> Thanks again for help/suggestions

You didn't specify the Postgres version - that matters, because the tar
format you're using (--format=t) does not support compression, which is
likely why you ran out of space. Use the custom or directory format
instead (-Fc or -Fd); both compress the output with gzip by default.
Since PG 16 you can also pick the algorithm and level with
--compress=method:level (the supported methods depend on how the
packages were built, no idea what platform you're on etc.). See

https://www.postgresql.org/docs/current/app-pgdump.html

If you're on an older version, or want something better than gzip, you
can write the dump to standard output and compress it there. Something
like

    pg_dump -Fc -Z0 lobarch | gzip -c > compressed.dump.gz

(the -Z0 disables the built-in gzip compression, so the data doesn't
get compressed twice; you can also swap gzip for e.g. pigz to spread
the compression over multiple CPUs).

However, be aware that pg_dump is more of an export tool than a backup
solution suitable for large databases / quick recovery. It won't allow
doing PITR and similar stuff.
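FWIW, on PG 16 or newer the whole thing might look something like this
(just a sketch - it assumes your packages were built with zstd support,
"lobarch" is the database name from your command, and the path, level
and job count are placeholders to adjust):

    pg_dump -Fd --compress=zstd:5 -j 4 -f /backup/lobarch.dump lobarch

The directory format is also the only one that supports parallel dumps
(-j), which is where those multiple CPUs help.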
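And if you ever do need PITR or fast recovery, look at physical backups
instead - pg_basebackup plus WAL archiving is the usual starting point,
e.g. (again just a sketch):

    pg_basebackup -D /backup/base -Ft -z -X stream -P

That writes a gzip-compressed copy of the whole cluster, streaming the
WAL needed to make it consistent.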
regards

-- 
Tomas Vondra