Re: PostgreSQL equivalents to dbms_metadata.get_ddl & dbms_metadata.get_dependent_ddl

On Mon, Mar 20, 2023 at 10:35 PM Ron Johnson <ronljohnsonjr@xxxxxxxxx> wrote:
Did you try  https://github.com/MichaelDBA/pg_get_tabledef?

On Mon, Mar 20, 2023 at 12:39 PM richard coleman <rcoleman.ascentgl@xxxxxxxxx> wrote:
Thomas, 
Thanks for letting me know about this product. Looking at their page doesn't fill me with a sense of comfort: there's too little information, nothing clear about on-prem vs. in-cloud deployment, no pricing information, etc. Just a lot of marketing fluff and directions to contact their sales people.

Of course, this doesn't address the fundamental concern, which is that PostgreSQL currently makes it needlessly difficult to get basic database information, in this case DDL, via SQL. Liquibase, like pg_dump, is yet another external utility needed to fill this gap in PostgreSQL's native functionality.
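
For what it's worth, a few pieces do already exist as SQL-callable functions; there is just no single get_ddl equivalent for whole tables. A minimal sketch, with placeholder object names:

SELECT pg_get_viewdef('my_view'::regclass, true);          -- view definition
SELECT pg_get_functiondef('my_func(int)'::regprocedure);   -- function DDL
SELECT pg_get_indexdef('my_index'::regclass);              -- CREATE INDEX statement
SELECT conname, pg_get_constraintdef(oid)
FROM pg_constraint
WHERE conrelid = 'my_table'::regclass;                      -- table constraints

Everything else (columns, defaults, storage parameters, ...) has to be stitched together from the catalogs or pulled out of pg_dump.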

I've resigned myself to having to assemble the information I need by hand.  I can only hope that the PostgreSQL developers see their way to including it in future versions.

Thanks again, 
rik.

On Mon, Mar 20, 2023 at 12:10 PM Thomas Kellerer <shammat@xxxxxxx> wrote:
richard coleman wrote on 20.03.2023 at 16:48:
> My use case is copying and maintaining the same tables in Oracle and
> PostgreSQL as well as detecting and applying DDL changes between the
> two systems.
We are using Liquibase as a schema migration tool which allows us to
specify a single "changelog" (using their XML format) that can be
applied to Oracle and Postgres alike.

So we never need to "compare" definitions; we just run the changelog
script to update the database schema.

Some differences, e.g. in available data types, can be dealt with using
Liquibase's properties or, if everything else fails, using different
changesets for Postgres and Oracle (which is typically only needed
for views and procedural code).
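
Purely as an illustration of that fallback (our changelog is XML, but Liquibase also accepts changelogs written as formatted SQL, so the sketch can stay in SQL; all names are made up), per-database changesets look roughly like this:

--liquibase formatted sql

--changeset thomas:recent-orders-view-pg dbms:postgresql
CREATE VIEW v_recent_orders AS
SELECT * FROM orders WHERE order_date > now() - interval '30 days';

--changeset thomas:recent-orders-view-ora dbms:oracle
CREATE OR REPLACE VIEW v_recent_orders AS
SELECT * FROM orders WHERE order_date > SYSDATE - 30;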


Of course it might not be feasible to re-structure your existing
infrastructure for schema migrations to use a completely different
approach.

Thomas



Yech, I tried.
I used the following table for a test:

CREATE TABLE test_table_def (
    a text COLLATE "german_phonebook"
    ,b bigint GENERATED BY DEFAULT AS IDENTITY PRIMARY KEY WITH (fillfactor=70)
    ,c int DEFAULT 7 CHECK (c > 1)
    ,d int REFERENCES test0(test0_d) ON DELETE CASCADE ON UPDATE CASCADE
    ,e text DEFAULT 'hello'
    ,UNIQUE (e) WITH (fillfactor=70)
    ,f text GENERATED ALWAYS AS (
        CASE
            WHEN c < 10 THEN 'within10'
            WHEN c < 20 AND c >= 10 THEN '10 to 20'
            WHEN c >= 20 THEN 'largerthan20'
        END
    ) STORED
) WITH (fillfactor=70);

--------
For now, pg_get_tabledef misses the generation expression and the index storage parameter info (the fillfactor on the primary key and unique indexes above).
Since tables are related to so many other objects, using pg_dump is the right choice.
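
For completeness, the two missing pieces can be recovered from the catalogs; a rough sketch against the test table above:

-- generation expression of the stored generated column
SELECT a.attname,
       pg_get_expr(d.adbin, d.adrelid) AS generation_expression
FROM pg_attribute a
JOIN pg_attrdef d ON d.adrelid = a.attrelid AND d.adnum = a.attnum
WHERE a.attrelid = 'test_table_def'::regclass
  AND a.attgenerated = 's';

-- storage parameters (e.g. fillfactor) on the table and its indexes
SELECT c.relname, c.reloptions
FROM pg_class c
WHERE c.oid = 'test_table_def'::regclass
   OR c.oid IN (SELECT indexrelid FROM pg_index
                WHERE indrelid = 'test_table_def'::regclass);

The pg_dump route boils down to something like pg_dump --schema-only --table=test_table_def <dbname>, which emits all of the above in one go.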


