
Re: Replicating hundreds of thousands of rows

On 25 November 2016 at 06:23, Job <Job@xxxxxxxxxxxxxxxxxxxx> wrote:
> Hello,
>
> we need to replicate hundreds of thousands of rows (for reporting) between PostgreSQL database nodes that are in different locations.
>
> Currently, we use Rubyrep with PostgreSQL 8.4.22.

8.4 is now end-of-life. You should move to the latest version.

> It works fine but it is very slow with a massive number of rows.
>
> With PostgreSQL 9.x, are there ways to replicate these quantities of data (in the background, not in real time)?
> We need periodic synchronization.

You have a choice of:

* Physical streaming replication, built in from 9.0+
* Logical streaming replication, partially built in from 9.4+ via the pglogical extension (sketch below)
* Logical streaming replication, fully built in from 10.0+ (not yet released)

Performance is much better than Rubyrep.
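
For the 9.4+ pglogical option, the setup is roughly as below. This is only a sketch: the node names, DSNs and schema list are placeholders, and you should check the pglogical documentation for your version for the exact parameters.

On both nodes, in postgresql.conf:

    wal_level = 'logical'
    max_worker_processes = 10
    max_replication_slots = 10
    max_wal_senders = 10
    shared_preload_libraries = 'pglogical'

On the provider (the node you replicate from):

    CREATE EXTENSION pglogical;
    SELECT pglogical.create_node(
        node_name := 'provider1',
        dsn := 'host=provider.example dbname=reports'
    );
    -- add every table in schema public to the default replication set
    SELECT pglogical.replication_set_add_all_tables('default', ARRAY['public']);

On the subscriber (the reporting node):

    CREATE EXTENSION pglogical;
    SELECT pglogical.create_node(
        node_name := 'subscriber1',
        dsn := 'host=subscriber.example dbname=reports'
    );
    -- does an initial copy of the data, then streams changes in the background
    SELECT pglogical.create_subscription(
        subscription_name := 'reporting_sub',
        provider_dsn := 'host=provider.example dbname=reports'
    );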

-- 
Simon Riggs                http://www.2ndQuadrant.com/
PostgreSQL Development, 24x7 Support, Remote DBA, Training & Services

