
Re: Generate test data inserts - 1GB

On Fri, 9 Aug 2019, 21:25 Adrian Klaver, <adrian.klaver@xxxxxxxxxxx> wrote:
On 8/9/19 8:14 AM, Shital A wrote:
>
> Hello,
>
> 4) What techniques have you tried?
> INSERT INTO with a WITH statement, inserting 2,000,000 rows at a time. This
> takes 40 minutes.
>

To add to my previous post: if you already have data in a Postgres
database, then you could do:

pg_dump -d db -t some_table -a -f test_data.sql

That will dump the data only for that table, in COPY format. Then you
could apply it to your test database (after a TRUNCATE on the table,
assuming you want to start fresh):

psql -d test_db -f test_data.sql
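
Spelled out, the truncate-and-reload step could look something like this (same placeholder database and table names as above):

psql -d test_db -c "TRUNCATE some_table"
psql -d test_db -f test_data.sql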




--
Adrian Klaver
adrian.klaver@xxxxxxxxxxx

Thanks for the reply, Adrian.

I missed one requirement: will these methods generate the WAL needed for replication?

Actually, the data is to check whether replication catches up. Below is the scenario:

1. Have a master-slave cluster with replication set up.

2. Kill the master so that the standby takes over. We are using Pacemaker for automatic failover.
Insert 1 GB of data into the new master while replication is broken.

3. Start the old node as a standby and check whether the 1 GB of data gets replicated.
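
For the check in step 3, one option might be to compare LSNs on the new master, for example (assuming PostgreSQL 10 or later, where pg_stat_replication exposes replay_lsn):

-- run on the new master once the old node is reattached as a standby
SELECT application_name,
       replay_lsn,
       pg_wal_lsn_diff(pg_current_wal_lsn(), replay_lsn) AS replay_lag_bytes
FROM pg_stat_replication;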

As such testing might be frequent, we need to spend minimal time generating the data.
The master and slave are on the same network.
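
For reference, one quick way to produce roughly 1 GB of throwaway data might be a generate_series-based insert along these lines (table name and row size are just placeholders):

-- hypothetical table; ~1 kB per row, so 1,000,000 rows is roughly 1 GB
CREATE TABLE IF NOT EXISTS repl_test (id bigint, payload text);

INSERT INTO repl_test (id, payload)
SELECT g, repeat('x', 1000)
FROM generate_series(1, 1000000) AS g;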

Thanks!
