I started doing the following to define my primary keys long ago and in a universe far away:

    CREATE TABLE employee (
        employee_key integer DEFAULT nextval('employee_key_serial') PRIMARY KEY,
        ...

We ran into a scenario, after a total db restore on a project, where we got errors inserting new data because the keys were duplicated. Looking at a pg_dump, I think I now understand why: although the values of the keys, and the data structures that reference them, look like they will be restored correctly, it appears that the sequences get recreated with an initial value of 1, which means the next insert will get 1 for a key, which is most likely already in use.

It looks like this is a different way of defining the same thing:

    CREATE TABLE books (
        id SERIAL PRIMARY KEY,
        ...

which has the advantage of not having to manually create the sequences. Will this also ensure that the "internally created sequence" is initialized to a value above the maximum key in use after a pg_restore?
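In the meantime, I assume I could resynchronize each sequence by hand after the restore with something like the following (a minimal sketch against the employee table above; the COALESCE is only there so it doesn't choke on an empty table):

    -- Set the sequence to the current maximum key so the next
    -- nextval() hands out a value that is not already in use.
    SELECT setval('employee_key_serial',
                  COALESCE((SELECT max(employee_key) FROM employee), 1));

For reference, my understanding is that SERIAL is essentially shorthand for something like this, with the sequence created behind the scenes:

    CREATE SEQUENCE books_id_seq;
    CREATE TABLE books (
        id integer NOT NULL DEFAULT nextval('books_id_seq') PRIMARY KEY
    );

--
"They that would give up essential liberty for temporary safety deserve neither liberty nor safety." -- Benjamin Franklin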