liuyuanyuan

From: Michael Paquier
Date: 2013-08-07 15:26
To: Albe Laurenz
Subject: Re: inserting huge file into bytea cause out of memory

On Wed, Aug 7, 2013 at 3:56 PM, Albe Laurenz <laurenz.albe@xxxxxxxxxx> wrote:
> liuyuanyuan wrote:
>>> By the way, my project is about migrating Oracle data of BLOB type to
>>> a PostgreSQL database. The out-of-memory error occurred while migrating
>>> Oracle BLOB to PostgreSQL bytea. Another question: if I can't migrate
>>> BLOB to bytea, how about the oid type?
>
> Laurenz Albe wrote:
>> Large Objects (I guess that's what you mean with "oid" here)
>> might be the better choice for you, particularly since you
>> have out of memory problems.
Michael wrote:
> Take care that the limit of large objects is 2GB in Postgres 9.2 or
> lower (with default block size). By the way, you will be fine in the
> case of your application. It is also worth noticing that this limit is
> increased to 4TB in 9.3.
Thanks for your last reply!
I've tested Large Objects (the oid type), and they work much better with
respect to the out-of-memory problem.
But for the out-of-memory problem with bytea, is there really no way to
solve it? Why can't it be solved? Is this a problem of JDBC, or of the
type itself?
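For what it's worth, a common JDBC-side cause of the bytea out-of-memory error is passing the whole file to PreparedStatement.setBytes(), which keeps the entire file (plus the driver's encoded copy) in the heap at once. Below is a hedged sketch, not the thread's actual code: the table `images(id integer, data bytea)` and the connection details are illustrative assumptions. It streams the file with setBinaryStream() so the driver can read it in chunks rather than materializing one huge byte array:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class ByteaStreamInsert {

    // Build a PostgreSQL JDBC URL (host, port, and database are illustrative).
    static String jdbcUrl(String host, int port, String db) {
        return "jdbc:postgresql://" + host + ":" + port + "/" + db;
    }

    public static void main(String[] args) throws Exception {
        String path = args[0];
        long length = new File(path).length();
        try (Connection conn = DriverManager.getConnection(
                     jdbcUrl("localhost", 5432, "mydb"), "user", "password");
             InputStream in = new FileInputStream(path);
             PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO images (id, data) VALUES (?, ?)")) {
            ps.setInt(1, 1);
            // Stream the file instead of calling setBytes(byte[]):
            // the driver pulls from the InputStream in chunks, so the
            // client never needs the whole file in memory at once.
            ps.setBinaryStream(2, in, length);
            ps.executeUpdate();
        }
    }
}
```

Even with client-side streaming, the server still has to assemble the whole bytea value, so for files in the hundreds of megabytes the large-object route discussed above remains the safer choice.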
Yours,
Liu Yuanyuan