On 10/22/19 2:47 PM, stan wrote:
I started doing the following to define my primary keys long ago and in a universe far away:

    CREATE TABLE employee (
        employee_key integer DEFAULT nextval('employee_key_serial') PRIMARY KEY ,

We ran into a scenario, after a total db restore on a project, where we got errors inserting new data because the keys were duplicated. Looking at a pg_dump, it appears to me that I now understand why. Although the values of the keys, and the data structures that reference them, look like they will get restored correctly, it appears to me that the sequences get recreated with an initial value of 1, which means that on the next insert we will get 1 for a key, which is likely already in use.

It looks like this is a different way of defining this:

    CREATE TABLE books (
        id SERIAL PRIMARY KEY,

which has the advantage of not having to manually create the sequences. Will this also ensure that the "internally created sequence" will be initialized to a value above the maximum key in use after a pg_restore?
When I restored an 8.4 database to 9.6 using pg_dump/pg_restore, all of the sequences were restored using the sequence values at the time the pg_dump ran.
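If a restore ever does leave a sequence behind the table's maximum key, it can be resynchronized by hand with setval(). A minimal sketch, using the employee table and employee_key_serial sequence from the original post (adapt the names to your schema):

```sql
-- Advance the sequence so the next nextval() returns MAX(employee_key) + 1.
-- The third argument 'false' tells PostgreSQL the value has not been
-- consumed yet, so nextval() will return it as-is; COALESCE handles an
-- empty table by starting the sequence at 1.
SELECT setval('employee_key_serial',
              COALESCE((SELECT MAX(employee_key) FROM employee), 0) + 1,
              false);
```

The same statement works for the sequence behind a SERIAL column; its name can be found with pg_get_serial_sequence('books', 'id').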
-- Angular momentum makes the world go 'round.