I was trying to produce the smallest file possible, so I tried xz. Right now the database size returned by SELECT pg_size_pretty(pg_database_size('postgres')) is 1.4 GB, while the xz-compressed dump is 2.2 GB.
Is there a limit to the size of database that pg_dump can handle? Will it still work when the database grows into the TBs?
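For what it's worth, here is a rough sketch of the two common ways to get a compressed dump; the database name "postgres" and the output paths are just placeholders, and this obviously needs a running server:

```shell
# Plain-format dump piped straight through xz.
# -T0 lets xz use all cores; -9 trades CPU time for size.
pg_dump -Fp postgres | xz -9 -T0 > postgres.sql.xz

# Custom-format dump with pg_dump's built-in compression.
# Slightly larger than xz -9, but pg_restore can then do
# selective and parallel restores from it.
pg_dump -Fc -Z 9 -f postgres.dump postgres
```

Note that compressing an already-compressed custom-format dump with xz again buys little and can even grow the file, since compressed data is essentially incompressible.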
Also, has anyone found handy cron scripts for automated daily/weekly backups? I found some via Google, but I'm interested to know if there are better ones.
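In case it helps, a minimal crontab sketch for nightly dumps might look like the following; the paths, schedule, and retention period are assumptions, and it presumes a ~/.pgpass file so no password prompt is needed:

```shell
# Nightly custom-format dump at 02:00, date-stamped.
# (% must be escaped in crontab entries.)
0 2 * * * pg_dump -Fc -f /backups/postgres_$(date +\%F).dump postgres

# Weekly cleanup on Sundays: drop dumps older than 30 days.
0 3 * * 0 find /backups -name 'postgres_*.dump' -mtime +30 -delete
```

Anything more elaborate (rotation, off-site copy, alerting on failure) is usually better kept in a small script that cron invokes, so errors can be logged and checked.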
Thanks,
ap
On Wed, Oct 28, 2015 at 12:05 AM, Adrian Klaver <adrian.klaver@xxxxxxxxxxx> wrote:
On 10/27/2015 04:10 PM, anj patnaik wrote:
I am running pg_dump on a database while another machine runs a
loop doing insertions.
Does pg_dump wait for idle activity before it completes or how does it
determine when it has all the records needed for archiving?
http://www.postgresql.org/docs/9.4/interactive/app-pgdump.html
"pg_dump is a utility for backing up a PostgreSQL database. It makes consistent backups even if the database is being used concurrently. pg_dump does not block other users accessing the database (readers or writers)."
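The consistency comes from pg_dump running inside a single transaction with a fixed snapshot: rows committed after the dump starts simply aren't visible to it, so it never "waits for idle". A hypothetical illustration of the same snapshot behavior at the SQL level (the table name some_table is made up):

```shell
psql -d postgres <<'SQL'
BEGIN ISOLATION LEVEL REPEATABLE READ;
SELECT count(*) FROM some_table;  -- snapshot is taken at this first query
-- rows inserted/committed by other sessions from this point on
-- are invisible inside this transaction
SELECT count(*) FROM some_table;  -- returns the same count as above
COMMIT;
SQL
```

So your insertion loop can keep running; the dump just reflects the database as of the moment it started.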
I am using the compressed mode and also using xz.
Again, why?
Thanks,
ap
--
Adrian Klaver
adrian.klaver@xxxxxxxxxxx