Hello all,

Firstly, I apologise if this is not the correct list for this subject.

Lately I've been working on a data conversion, importing into Postgres using COPY FROM. The text file I'm copying from is produced by an ancient program, which exports either a tab- or semicolon-delimited file. One file contains about 1.8M rows and has a 'comments' column. The exporting program, which I am forced to use, does not surround this column with quotes, and the column contains CR/LF characters, which I must deal with (and have dealt with) before I can import the file via COPY.

Hence my suggestion: I envision a parameter, DELIMITER_COUNT, which, if one were 100% confident that all columns are accounted for in the input file, could be used to remove the need to deal with CR/LFs in varchar and text columns. That is, if COPY read a line containing fewer delimiters than delimiter_count, it would read the next line from the text file and continue assigning columns to the current row (a rough sketch of the merging logic I have in mind follows below my signature).

Just curious as to the thoughts out there. Thanks to all for this excellent product, and a merry Christmas/holiday period to all.

Mark Watson
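
P.S. To illustrate what I mean (and roughly the work-around I apply today before running COPY), here is a minimal sketch in Python that merges physical lines until each logical row carries the expected number of delimiters. The file names, encoding and column count are only examples for illustration, not part of the proposal itself.

    #!/usr/bin/env python3
    # Merge physical lines into logical rows so that each row carries the
    # expected number of delimiters -- i.e. roughly what a hypothetical
    # DELIMITER_COUNT option to COPY might do internally.

    EXPECTED_DELIMITERS = 9      # e.g. a 10-column table -> 9 tabs per row
    DELIM = "\t"

    def merge_rows(lines):
        """Yield logical rows, joining physical lines until a row
        contains at least EXPECTED_DELIMITERS delimiters."""
        buffered = ""
        for line in lines:
            line = line.rstrip("\r\n")
            # An embedded CR/LF in the comments column becomes a space.
            buffered = line if not buffered else buffered + " " + line
            if buffered.count(DELIM) >= EXPECTED_DELIMITERS:
                yield buffered
                buffered = ""
        if buffered:
            yield buffered       # trailing partial row, if any

    with open("export.txt", encoding="latin-1") as src, \
         open("export_clean.txt", "w", encoding="latin-1") as dst:
        for row in merge_rows(src):
            dst.write(row + "\n")

The cleaned file can then be loaded with an ordinary COPY FROM, since every row now has the full set of columns on a single line.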