On 01/05/2017 11:46 AM, Adrian Klaver wrote:
On 01/05/2017 08:31 AM, Rob Sargent wrote:
On 01/05/2017 05:44 AM, vod vos wrote:
I finally figured it out as follows:
1. Modified the data types of the corresponding columns to match the csv file.
2. Where null values existed, defined the data type as varchar. The null
values caused problems too.
So all 1100 columns work well now.
This problem cost me three days. I have lots of csv data to COPY.
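For anyone hitting the same issue, here is a minimal sketch of what the fix looks like (the table and column names are hypothetical; the key points are matching the column types to the csv contents and telling COPY how empty fields encode NULL):

```sql
-- Hypothetical table: column types chosen to match the csv contents,
-- rather than declaring every column varchar.
CREATE TABLE wide_data (
    id        integer,
    label     varchar,   -- text column; empty strings allowed
    measured  numeric,   -- empty csv field must map to NULL, not ''
    taken_on  date
);

-- Load the file, treating empty unquoted fields as NULL so that empty
-- values in numeric/date columns don't raise a type error. NULL '' is
-- already the default in csv format; it is spelled out here explicitly.
COPY wide_data FROM '/path/to/data.csv'
    WITH (FORMAT csv, HEADER true, NULL '');
```

With csv format, an unquoted empty field is read as NULL by default, so nullable numeric and date columns load cleanly without falling back to varchar.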
Yes, you cost yourself a lot of time by not showing the original table
definition into which you were trying to insert data.
Given that the table had 1100 columns I am not sure I wanted to see it:)
Still the OP did give it to us in description:
https://www.postgresql.org/message-id/15969913dd3.ea2ff58529997.7460368287916683127%40zoho.com
"I create a table with 1100 columns with data type of varchar, and
hope the COPY command will auto transfer the csv data that contains
some character and date, most of which are numeric."
In retrospect, what I should have pressed for was a more complete
description of the data. I underestimated this description:
"And some the values in the csv file contain nulls, do this null
values matter? "
My apologies for missing that. I was sure there would be room for some
normalization, but so be it: the OP's happy, I'm happy.
--
Sent via pgsql-general mailing list (pgsql-general@xxxxxxxxxxxxxx)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general