Hi all,

I read that pg_dump can run while the database is being used and makes "consistent backups". I have a huge and *heavily* selected, inserted and updated database. Currently I have a cron task that disconnects the database users, makes a backup using pg_dump, and puts the database online again. The problem is that there is now too much data, and every day the database stores more and more, so the backup process needs more and more time to run. I am thinking about doing the backup with a process that causes minimal interruption for the users.

I do not need an up-to-the-last-second backup. I could take a backup with "almost all" the data, but the information in it needs to be coherent. For example, if the backup stores information about an invoice, it *must* store both the invoice header and the invoice items. I can live with the backup missing some invoices created while it is running, because they will be backed up the next time the backup process runs. But it must not store only part of an invoice. That is what I call a coherent backup.

The best for me would be a cron job that does a concurrent backup with all the information up to the time it starts running, while the clients keep using the database. Example: if cron launches the backup process at 12:30 AM, the backup must be built with all the information *up to* 12:30 AM, so that if I need to restore it I get a coherent database with the same information it had at 12:30 AM. It does not matter if the process needs 4 hours to run.

Does pg_dump create this kind of "consistent backup"? Or do I need to do the backups using another program?

Regards
Pablo
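P.S. In case it helps, this is a rough sketch of the cron entry I have in mind, running pg_dump directly against the live database instead of disconnecting the users first (the database name, user and paths are just placeholders):

    # Every night at 00:30, dump the database in pg_dump's custom format
    # (% must be escaped as \% inside a crontab line)
    30 0 * * *  pg_dump -U backup_user -Fc -f /backups/mydb_$(date +\%Y\%m\%d).dump mydb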