On Monday, November 21, 2011 4:53:21 pm Yan Chunlu wrote:
> and database will stop receiving the following data after detected an
> error?
> that means while using pg_restore, no error allowed to happen, otherwise
> the database will stop receiving data and the import will fail.
>
> I found only one record in psql's log:
>
> duplicate key value violates unique constraint "account_pkey"
>
> does that means one duplicate record will prevent all other records to
> import?

For that table, yes. Also, if that table is the parent in FK relationships
with other tables, those tables will fail to import as well, because the keys
they refer to will not exist.

To get around this you have several options:

1) Find the duplicate entry(s) in the original table and eliminate them
before dumping (a rough SQL sketch is below).

2) Dump the table by itself to a plain-text format and eliminate the
duplicate(s) in the plain-text file before restoring.

3) By default pg_dump uses COPY to load data into tables. As you have found
out, that runs as a single transaction and rolls back if there is an error.
You can specify --inserts on the pg_dump command line to get it to output an
INSERT for each row. The upside is that each INSERT is a separate
transaction, so one bad row does not abort the rest. The downside is that if
there is a lot of data it will take a long time to load, for the same reason.
(Example pg_dump invocations for 2) and 3) are below.)

4) Use pgloader (http://pgfoundry.org/projects/pgloader/). It is a Python
program that 'manages' COPY. It will kick out bad rows and keep loading data.

--
Adrian Klaver
adrian.klaver@xxxxxxxxx

--
Sent via pgsql-general mailing list (pgsql-general@xxxxxxxxxxxxxx)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general
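
For option 1), a minimal sketch of finding and then removing the duplicates.
This assumes the violated key is a single "id" column on the "account" table;
the column name here is a guess, so adjust it to your actual primary key, and
take a backup of the table before deleting anything:

    -- List key values that appear more than once.
    SELECT id, count(*)
    FROM account
    GROUP BY id
    HAVING count(*) > 1;

    -- Delete all but one physical row per duplicated key value.
    -- ctid identifies the physical row, so of each duplicate pair the
    -- "later" copy is removed.
    DELETE FROM account a
    WHERE EXISTS (
        SELECT 1
        FROM account b
        WHERE b.id = a.id
          AND b.ctid < a.ctid
    );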
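
For options 2) and 3), the pg_dump invocations would look something like the
following. The database name "mydb" is made up, and --inserts needs a
reasonably recent pg_dump:

    # Option 2: dump just the problem table as plain SQL you can edit.
    pg_dump --format=plain --table=account mydb > account.sql

    # Option 3: dump the table with one INSERT per row instead of COPY.
    pg_dump --inserts --table=account mydb > account_inserts.sql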