Multi master use case?

Hello,

A client of ours has always had problems with slow internet connectivity - they are in a part of the country where that is a problem. There are a few hundred staff sharing a couple of asymmetric (ADSL) connections. One issue is accessing their web-based Postgres app, which we host. They don't want to run it internally for a lot of the usual reasons, not least that they have many distributed workers, and trying to serve data from an already congested site would be a non-starter.

Is this a case for multi-master, do you think? I.e. running one master on the internet and one locally.

Looking through the wiki

http://wiki.postgresql.org/wiki/Replication,_Clustering,_and_Connection_Pooling

it seems a few solutions have now gained maturity. Something like rubyrep sounds ideal. It would have to cope with:
a) a flaky local connection
b) changing schemas (new tables, fields, views etc.) as well as data

Create/update/delete frequencies are reasonably low - generally individuals updating single records, so on the order of thousands per day at most.
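For context on what I mean by conflict handling at these low frequencies: tools like rubyrep let you pick a resolution strategy when both masters change the same row between syncs (if I recall correctly it offers options such as left-wins and later-wins). A minimal sketch of the "later change wins" idea, with hypothetical row dictionaries and an assumed `updated_at` column, might look like:

```python
# Hypothetical sketch of last-write-wins conflict resolution between
# two masters. Rows are plain dicts; the "updated_at" timestamp column
# is an assumption (the app would need such a column on each table).

def resolve_conflict(local_row, remote_row, ts_field="updated_at"):
    """Return whichever version of the row was modified more recently;
    ties go to the local copy."""
    if local_row[ts_field] >= remote_row[ts_field]:
        return local_row
    return remote_row

# Example: the remote edit is newer, so it wins.
local = {"id": 1, "name": "old value", "updated_at": 100}
remote = {"id": 1, "name": "new value", "updated_at": 200}
winner = resolve_conflict(local, remote)
```

With only thousands of single-record updates a day, conflicts of this kind should be rare anyway.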

Any experiences/thoughts?

Oliver Kohll
www.gtwm.co.uk
-- 
Sent via pgsql-general mailing list (pgsql-general@xxxxxxxxxxxxxx)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general
