I am piggy-backing on this thread because I have the same issue. I need to import a CSV file that is 672 columns wide, where each column holds 12 alphanumeric characters. The first rows look roughly like this:

    SA03ARE1015D  SA03ARE1S15N  SB03ARE1015D  ...
    356412        275812        43106         ...

I am aware this is not normalized; however, we try to keep the source data intact and normalize it after importing into our system.

When I try to import with all columns declared as type `text`, I get this error:

    [54000] ERROR: row is too big: size 8760, maximum size 8160
    Where: COPY temp_table, line 3
    SQL statement "copy temp_table from '/home/edgleweb/data/raw/TX/TAPR/2015/ADV/SSTAAR1ADV.csv' with delimiter ',' quote '"' csv"

I tried varchar(12) as well; nothing changed.

My questions are:

1) The first row (which actually contains the headers) has 672 x 12 = 8,064 characters, so why does it complain that the row is 8,760 bytes? I am assuming here that type `text` occupies 1 byte per character.

2) Is there anything I can do to work around this situation?

Thanks in advance.
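
P.S. In case it helps, here is roughly how the staging table and the COPY are set up. This is only a sketch: the real table has 672 text columns, the column names below are just the first few CSV headers, and I have rewritten the COPY options in the newer parenthesized syntax.

    -- Sketch of the staging table: one text column per CSV column
    -- (672 in total; only the first three headers are shown here).
    CREATE TABLE temp_table (
        "SA03ARE1015D" text,
        "SA03ARE1S15N" text,
        "SB03ARE1015D" text
        -- ... remaining text columns omitted ...
    );

    -- The COPY that fails with "row is too big":
    COPY temp_table
    FROM '/home/edgleweb/data/raw/TX/TAPR/2015/ADV/SSTAAR1ADV.csv'
    WITH (FORMAT csv, DELIMITER ',', QUOTE '"');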