If I were in a hurry to implement this, and I had a userbase that wasn't very experienced with managing relational databases, I'd write some code to automatically and periodically build a Docker image with the latest data in it (however often is sufficient to meet your needs). Then I'd set up a one-line scheduled command on the laptops that pulls the latest image, give the users a script that runs the container locally, and give them a client that knows how to connect to it. Assuming the database only needs to be read-only while you are disconnected, I could automate all of that in a few hours in most environments, and the changes required on the individual laptops would be minimal.
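As a rough sketch of what I mean (this assumes the official postgres image and a nightly pg_dump; every name, path, and registry address below is a placeholder you would adapt to your environment):

On the server side, rebuild and push the image on a schedule:

    # Dockerfile: the official postgres image loads any *.sql file it
    # finds in /docker-entrypoint-initdb.d/ the first time it starts
    FROM postgres:11
    ENV POSTGRES_PASSWORD=changeme
    COPY latest_dump.sql /docker-entrypoint-initdb.d/

    # nightly job (cron or similar) on the build machine
    pg_dump -h office-server -U office_user office_db > latest_dump.sql
    docker build -t registry.example.com/office-db-snapshot:latest .
    docker push registry.example.com/office-db-snapshot:latest

On each laptop, a scheduled pull plus one small run script:

    # run whenever the laptop is online
    docker pull registry.example.com/office-db-snapshot:latest

    # run_local_db.sh: start the local read-only copy on port 5432
    docker run -d --name office-db -p 5432:5432 \
        registry.example.com/office-db-snapshot:latest

The point is that the only moving parts on a laptop are Docker, the scheduled pull, and that one script; all of the database knowledge stays on the build side.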
If you need to be able to write to the database when disconnected, and pull those writes into the central instance when you reconnect, that is a tougher problem and better suited to some of the earlier suggestions. But if you only need to read when remote and just want something that works, is easy to put together, and can likely be built by an outside consultant for minimal expense and even less ongoing support and maintenance, I would pay someone to read that first paragraph, set it up for me, and call it good. Any of the suggested solutions is going to require a fair amount of administrative competence to put together, so going for one that shouldn't require much maintenance to keep synchronized is your best bet.
On Thu, Sep 5, 2019 at 3:43 PM Adrian Klaver <adrian.klaver@xxxxxxxxxxx> wrote:
On 9/5/19 2:00 PM, Judith Lacoste wrote:
> Hi,
>
> I think PostgreSQL is the solution for my needs, but I am not a
> programmer/coder. If I can confirm PostgreSQL does what I need, I will
> have to hire someone to assist. I am willing to make the effort to learn
> myself, but it may be difficult; my specialities are biology and
> microscopy. Or perhaps the use of PostgreSQL is restricted to people
> highly trained in computer science?
No, I am a biologist and I learned Postgres/database management. It is
about organizing things, and that is a commonality with biology.
> I have been looking around a lot through the PostgreSQL website,
> searching the archives, and I even contacted PostgreSQL people locally
> but I still don’t have a clear answer to my first question. So I am
> posting it here with the hope to move on with PostgreSQL, or abandon the
> project.
This would be the list to talk to.
> I plan to install the database on a server in the office. My four
> colleagues and I will occasionally connect to this database when we are
> working in other locations (usually hospitals or universities). In such
> remote locations, we often do not have internet/network, yet we still
> need to access the database. Currently, we use a system where a copy of
> the database lives on each of our laptops. We can access all the
> information in the database despite being offline. This local copy of
> the database is synchronized with the server once network becomes
> available again.
> My question is whether or not such a setup is possible with PostgreSQL?
The setup is possible, though how you would implement it would depend
on several factors:
1) What OS and versions are you using?
2) Are you working directly with the database or through an application?
3) What programming languages are you using?
There is also the option of using SQLite (https://sqlite.org/index.html)
for your 'local' databases and then syncing them to Postgres.
>
> Why am I interested in PostgreSQL? First, my work has made me aware of
> how precious open source tools are. Our main tools for data analysis
> are open source. Commercial equivalents are black boxes which we try to
> avoid in the name of science reproducibility and transparency.
> Secondly, the commercial software we are currently using is apparently
> based on PostgreSQL, so I am hoping that using PostgreSQL will make
> migration less painful.
>
> Thank you in advance,
>
> Judith
--
Adrian Klaver
adrian.klaver@xxxxxxxxxxx