Re: COPY FROM : out of memory

Alvaro Herrera wrote:
Arnaud Lesauvage wrote:
Martijn van Oosterhout wrote:
>On Thu, Nov 23, 2006 at 11:27:06AM -0500, Tom Lane wrote:
>>Arnaud Lesauvage <thewild@xxxxxxxxxxx> writes:
>>> When trying to import a 20M row csv file into PostgreSQL, I get:
>>
>>> ERROR: out of memory
>>> SQL state: 53200
>>> Detail: Failed on request of size 1073741823.
>>> Context: COPY tmp, line 1
>>
>>Can you put together a self-contained example?  The reference to "line
>>1" suggests that you wouldn't need the whole 20M row file, just the
>>first few rows ...
>
>Maybe it's a line termination problem?

I think you are right !
Trying to see the first line with sed outputs the whole file!
All I did was export the file as UNICODE from MSSQL, then convert it with iconv -f "UCS-4-INTERNAL" -t "UTF-8" myfile.csv.

I guess I still don't have the right encoding... :(
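
For what it's worth, a minimal sketch of that conversion step, assuming the MSSQL "UNICODE" export is really UTF-16LE (its usual Unicode export format) rather than UCS-4, and using placeholder file names; iconv writes to stdout, so the result has to be redirected:

    # assumption: the export is UTF-16LE with CRLF line endings
    iconv -f UTF-16LE -t UTF-8 myfile.csv > myfile.utf8.csv
    # strip carriage returns so each row ends in a plain \n
    tr -d '\r' < myfile.utf8.csv > myfile.unix.csv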

Did you set the encoding with \encoding?  I think it's critical for
determining line and field separators.  If you only do SET
client_encoding, the backend will work but psql may not.

Or do you mean that the first line of the text file is the whole file?  In
that case I'd guess that the iconv procedure is borked somehow, or maybe
the input file is OK for everything except the linefeed(*).
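
As a rough illustration of that point, inside psql (the table name tmp comes from the error message, the file name is just a placeholder):

    -- \encoding changes psql's own notion of the client encoding and also
    -- sets client_encoding on the backend, so field and line parsing agree
    \encoding UTF8
    \copy tmp from 'myfile.unix.csv' with csv

    -- SET client_encoding only informs the backend; psql itself is unchanged
    SET client_encoding TO 'UTF8';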

No, I used "SET client_encoding".
But I checked the file with sed, and sed agrees with PostgreSQL: there is just one line in the file. I have one last idea that I'll try today; if it doesn't work, I'll forget about this COPY stuff and work through ODBC.
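
A few quick checks along those lines that would show whether the converted file really contains a single line, or the wrong terminators (placeholder file name again):

    wc -l myfile.utf8.csv                  # counts \n characters, i.e. the lines sed and COPY see
    head -c 200 myfile.utf8.csv | od -c    # shows \r, \n and any stray NUL bytes left from the export
    file myfile.utf8.csv                   # reports the encoding and any CR/CRLF line terminators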

--
Arnaud

