
Re: Fastest way to duplicate a quite large database


 



On 04/12/2016 07:51 AM, Edson Richter wrote:
Same machine, same cluster - just different database name.

Hmm, running tests on the same cluster as the production database would seem to be a performance hit on production, and potentially dangerous should the tests trip a bug that crashes the server.
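For the same-cluster case, one fast option is PostgreSQL's template mechanism: CREATE DATABASE copies the source database's files directly, which is usually much faster than dump/restore. A sketch for the nightly job (it assumes no other sessions are connected to the source database while the copy runs, e.g. during a maintenance window, which may not hold on a busy production cluster):

```sql
-- Drop yesterday's copy, then clone the production database.
-- CREATE DATABASE ... TEMPLATE fails if any other session is
-- connected to "Customer", so this needs a quiet window.
DROP DATABASE IF EXISTS "CustomerTest";
CREATE DATABASE "CustomerTest" TEMPLATE "Customer";
```

Because the copy happens at the file level, the 60 GB case is bound mostly by disk throughput rather than by SQL-level restore speed.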


Regards,

Edson Carlos Ericksson Richter

On 12/04/2016 11:46, John R Pierce wrote:
On 4/12/2016 7:25 AM, Edson Richter wrote:

I have a database "Customer" with about 60Gb of data.
I know I can backup and restore, but this seems too slow.

Is there any other option to duplicate this database as
"CustomerTest" as fast as possible (even faster than backup/restore)
- better if in one operation (something like "copy database A to B")?
I would like to run this every day, overnight, with minimal impact, to
prepare a test environment based on production data.


Copy to the same machine, or copy to a different test server?
Different answers.
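For the different-server case, a common approach is a parallel dump and restore. A sketch, assuming PostgreSQL 9.3 or later (required for parallel directory-format dumps); "testhost" and the job count are placeholders to tune for the hardware:

```shell
# Parallel dump in directory format (-Fd) with 4 jobs (-j 4).
pg_dump -Fd -j 4 -f /tmp/customer.dump Customer

# Create the target database on the test server, then restore in parallel.
createdb -h testhost CustomerTest
pg_restore -h testhost -j 4 -d CustomerTest /tmp/customer.dump
```

Other options for a separate server include pg_basebackup, which clones the whole cluster at the file level rather than a single database.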

--
Adrian Klaver
adrian.klaver@xxxxxxxxxxx


--
Sent via pgsql-general mailing list (pgsql-general@xxxxxxxxxxxxxx)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general


