On 05/26/2017 05:07 AM, doganmeh wrote:
I am piggy-backing on this thread because I have the same issue. I need to
import a csv file that is 672 columns wide, and each column consists of 12
alpha-numeric characters, such as:
SA03ARE1015D SA03ARE1S15N SB03ARE1015D ...
356412 275812 43106 ...
I am aware this is not normalized; however, we keep (or try to keep) the
source data intact and normalize it after importing into our system.
While trying to import with all columns as type `text`, I get this error:
[54000] ERROR: row is too big: size 8760, maximum size 8160
Where: COPY temp_table, line 3
SQL statement "copy temp_table from
'/home/edgleweb/data/raw/TX/TAPR/2015/ADV/SSTAAR1ADV.csv' with delimiter ','
quote '"' csv "
I tried varchar(12) also; nothing changed. My questions are: 1) I have
672x12=8,064 characters in the first row (which are actually the headers),
so why does it complain the row is 8760 bytes? I am assuming here that type
`text` occupies 1 byte per character. 2) Is there anything I can do to work
around this situation?

https://www.postgresql.org/docs/9.6/static/datatype-character.html
"The storage requirement for a short string (up to 126 bytes) is 1 byte
plus the actual string, which includes the space padding in the case of
character."

So each of the 672 values carries a 1-byte length header on top of its 12
characters, and the row itself has a small fixed header as well; that
overhead is what pushes your 8,064 characters of data up to the 8760 bytes
the error reports.

Use csvkit's csvcut tool to split the file?:
http://csvkit.readthedocs.io/en/1.0.2/scripts/csvcut.html
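
Something along these lines, as a rough sketch (the 1-336/337-672 split
point, the output file names, and the two staging tables temp_table_part1
and temp_table_part2 are only placeholders for illustration):

  # split the 672-column file into two halves; csvcut -c accepts index ranges
  csvcut -c 1-336 SSTAAR1ADV.csv > SSTAAR1ADV_part1.csv
  csvcut -c 337-672 SSTAAR1ADV.csv > SSTAAR1ADV_part2.csv

and then load each half into its own staging table from psql:

  \copy temp_table_part1 from 'SSTAAR1ADV_part1.csv' with (format csv)
  \copy temp_table_part2 from 'SSTAAR1ADV_part2.csv' with (format csv)

Each half then stays comfortably under the ~8160-byte row limit.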
Thanks in advance.
--
Adrian Klaver
adrian.klaver@xxxxxxxxxxx