Re: PostgreSQL DB in prod, test, debug

On Wednesday, 14 February 2024 at 10:38, Erik Wienhold <ewie@xxxxxxxxx> wrote:

> On 2024-02-14 10:59 +0100, Simon Connah wrote:
> 

> > This project uses Python 3.12, Flask, psycopg3 and PostgreSQL 15.
> > 

> > This is probably a stupid question so I apologies in advance.
> > 

> > I'm building a website using PostgreSQL and since I've just been doing
> > some dev work on it I've just manually played around with the database
> > if I needed new tables or functions for example but I want to start
> > doing automated testing and need to import a clean snapshot of the
> > database with no data and then use the automated tests to test if
> > things work with the tests.
> > 

> > What I think is the best way to do this is to do a pg_dump of the
> > database (using the --schema-only flag)
> 

> 

> You create a dump from the prod database each time? Yikes.

Sorry. That came out wrong. I don't take a dump each time I run the tests, but I will generally take a full dump when working on something specific to the database.
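
For the test runs themselves I was thinking of driving the dump and restore
from Python with subprocess, roughly like this (the database names are made
up and I haven't wired it into the test suite yet, so it's only a sketch):

    import subprocess

    # Dump only the schema (no data) from the dev database.
    subprocess.run(
        ["pg_dump", "--schema-only", "--file=schema.sql", "myapp"],
        check=True,
    )

    # Create a throwaway test database and load the schema into it.
    subprocess.run(["createdb", "myapp_test"], check=True)
    subprocess.run(
        ["psql", "--quiet", "-d", "myapp_test", "-f", "schema.sql"],
        check=True,
    )

    # ... run the tests against myapp_test ...

    # Drop the test database once the run is finished.
    subprocess.run(["dropdb", "myapp_test"], check=True)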

> > and then load it into a test only database that gets created at the
> > start of the unit tests and destroyed at the end. The automated tests
> > will insert, update, delete and select data to test if it all still
> > works.
> > 
> > My main question is does this sound OK? And if so is there a nice way
> > to automate the dump / restore in Python?
> 
> The database schema should be defined by migration scripts that you also
> check into version control with the rest of your application sources.
> Some people also prefer a separate repository just for the database
> schema, depending on how tightly coupled database and application are,
> or if there are other applications relying on the database schema.
> 
> I use Sqitch[1] which works well if you want to track an existing
> database schema. Alembic is popular in Python but I don't know how it
> works with an existing schema because Alembic migrations are usually not
> written in plain SQL.
> 
> This is also useful for automated deployment because it allows you to
> migrate the database to a specific schema version that is necessary for
> your application.
> 
> For testing, I have a Bash script that starts a Docker container with
> Postgres and then runs sqitch-deploy before running pytest. That can of
> course be adapted to spin up a local Postgres instance instead. I also
> use pgTAP[2] with that to just test the database schema.
> 
> You can also use testcontainers[3] to start/stop a Docker container in
> pytest and run sqitch with subprocess before running all tests.
> 
> [1] https://sqitch.org/
> [2] https://pgtap.org/
> [3] https://testcontainers-python.readthedocs.io/en/latest/postgres/README.html
> 
> --
> Erik

Thank you. Sqitch looks like a useful tool. I'll certainly look into all of that.
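
If I'm reading the Sqitch docs right, getting started looks something like
this (the project and change names here are just made up for illustration):

    sqitch init myapp --engine pg
    sqitch add create_users_table -n 'Add users table'
    sqitch deploy db:pg:myapp_test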

I've never really used Docker before; I've just spun up Vultr instances instead. But I can read up on Docker as well.
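
The testcontainers route does look like it would slot into pytest and psycopg
nicely, though. Going by the docs, a test could look roughly like the sketch
below. I haven't actually run it, and the test/test credentials, the
postgres:15 image tag and the placeholder "users" table are all assumptions
on my part:

    import subprocess

    import psycopg
    from testcontainers.postgres import PostgresContainer

    def test_schema_deploys_cleanly():
        # Throwaway Postgres 15 container that only lives for this test.
        with PostgresContainer("postgres:15") as pg:
            host = pg.get_container_host_ip()
            port = pg.get_exposed_port(5432)

            # Apply the schema with sqitch (assumes a sqitch project in the
            # current directory and the container's default credentials).
            target = f"db:pg://test:test@{host}:{port}/test"
            subprocess.run(["sqitch", "deploy", target], check=True)

            # Check that an expected table now exists.
            with psycopg.connect(
                host=host, port=port, user="test",
                password="test", dbname="test",
            ) as conn:
                row = conn.execute("SELECT to_regclass('public.users')").fetchone()
                assert row[0] is not None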

Simon.


