Thanks Adrian, I think that works, but I am still facing a problem while loading a MySQL ( MEDIUMBLOB ) column into a PG ( BYTEA ) column :
test=# \encoding LATIN1;
test=# copy jobs from '/tmp/test.csv' with DELIMITER AS ',' QUOTE '"' NULL AS '\N' ESCAPE E'\\' CSV;
ERROR: invalid input syntax for type bytea
CONTEXT: COPY jobs, line 259, column wf_instance: "\0$0000000-120805203721153-oozie-oozi-W\0\0)<?xml version="1.0" encoding="UTF-8" standalone="no"?><..."
test=#
wf_instance is a single MEDIUMBLOB column in the table.
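From what I can tell, COPY expects bytea values in the file to be in one of PostgreSQL's text formats (hex or escape), not the raw blob bytes that MySQL dumps out. One workaround I am considering (a sketch only; the jobs_stage table, the id column, and the file name are hypothetical names for illustration) is to export the blob as a hex string on the MySQL side and decode it after loading:

-- MySQL side: dump the blob as a hex string (other columns elided here)
mysql> SELECT id, HEX(wf_instance) FROM jobs INTO OUTFILE '/tmp/test_hex.csv' FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"';

-- PostgreSQL side: load into a text staging table, then decode into bytea
test=# CREATE TABLE jobs_stage (id integer, wf_instance_hex text);
test=# copy jobs_stage from '/tmp/test_hex.csv' with DELIMITER AS ',' QUOTE '"' CSV;
test=# UPDATE jobs SET wf_instance = decode(s.wf_instance_hex, 'hex') FROM jobs_stage s WHERE jobs.id = s.id;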
On Thu, Jun 6, 2013 at 11:19 PM, Adrian Klaver <adrian.klaver@xxxxxxxxx> wrote:
On 06/06/2013 10:33 AM, Adarsh Sharma wrote:
> Hi,
>
> Today I need to load some MySQL ( 5.1.58 ) table dumps into PostgreSQL
> ( PG 9.2 ). I loaded a few tables successfully, but while loading one
> table I am facing the error below:
>
> test=# copy jobs from '/tmp/test.csv' with DELIMITER AS ',' QUOTE '"'
> NULL AS '\N' ESCAPE E'\\' CSV;
> ERROR: invalid byte sequence for encoding "UTF8": 0xc7 0x3c
> CONTEXT: COPY jobs, line 259
>
> After some research, I think it is failing because the MySQL table uses
> the latin1 character set while PG 9.2 defaults to UTF-8. But I don't
> understand how the other tables loaded successfully.

Probably because ASCII, Latin1 and UTF8 share the first 127 characters and your other tables did not have data that used characters above the first 127.
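A quick way to see the mismatch (a sketch; convert_from here just reinterprets the failing bytes): 0xc7 0x3c is the perfectly valid Latin1 text "Ç<", but 0xc7 is a UTF-8 lead byte that cannot be followed by 0x3c, so the same bytes are rejected as UTF-8:

test=# SELECT convert_from('\xc73c'::bytea, 'LATIN1');
 convert_from
--------------
 Ç<
(1 row)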
> The link below says to use iconv:
> http://www.perlmonks.org/?node_id=942765 , but I still get the same error.

Might try
test=# \encoding LATIN1
and then your copy command
or
use the ENCODING parameter to COPY:
http://www.postgresql.org/docs/9.2/interactive/sql-copy.html
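For example, with the 9.2 option-list syntax that could look something like this (a sketch reusing the options from your copy command, untested against your data):

test=# copy jobs from '/tmp/test.csv' with (FORMAT csv, DELIMITER ',', QUOTE '"', NULL '\N', ESCAPE E'\\', ENCODING 'LATIN1');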
> My doubt is that this table has one MEDIUMBLOB column, which I mapped to
> BYTEA in PostgreSQL. Was it due to that? If anyone has fixed this issue
> before, please update.
>
> Thanks

--
Adrian Klaver
adrian.klaver@xxxxxxxxx