Hi!
Thanks for all of your interest!
My test PC runs Windows 7 (64-bit) with 8 GB of memory.
In this Java project, I configured the VM options as:
-Djava.security.policy=applet.policy -Xms1280m -Xmx1536m
Everything the project needs is on the server described above.
By the way, my project is about migrating Oracle BLOB data to a
PostgreSQL database. The out-of-memory error occurred while migrating
Oracle BLOB to PostgreSQL bytea. Another question: if I can't migrate
BLOB to bytea, what about the oid type?
If anybody knows about this problem, please write to me.
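Not part of the original post, but in case it helps others hitting the same error: one way to avoid materializing each BLOB as a full byte[] on the Java heap is to hand the driver an InputStream via PreparedStatement.setBinaryStream, so the value is streamed in chunks. This is only a sketch using standard java.sql calls; the table and column names (src_docs, dst_docs, id, data) are made up, and the per-row commit is just one way to keep memory bounded:

```java
import java.io.InputStream;
import java.sql.Blob;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;

// Sketch: stream each Oracle BLOB into a PostgreSQL bytea column without
// loading the whole value into the Java heap. Table and column names are
// placeholders, not from the original post.
public class BlobToByteaMigration {

    // Pure helper: how many fixed-size buffers a stream of 'size' bytes
    // cycles through, e.g. numChunks(blob.length(), 8192) for 8 KB buffers.
    static long numChunks(long size, int chunkSize) {
        return (size + chunkSize - 1) / chunkSize;
    }

    static void migrate(Connection oracle, Connection pg) throws Exception {
        pg.setAutoCommit(false);
        try (Statement st = oracle.createStatement();
             ResultSet rs = st.executeQuery("SELECT id, data FROM src_docs");
             PreparedStatement ins =
                 pg.prepareStatement("INSERT INTO dst_docs (id, data) VALUES (?, ?)")) {
            while (rs.next()) {
                Blob blob = rs.getBlob("data");
                try (InputStream in = blob.getBinaryStream()) {
                    ins.setLong(1, rs.getLong("id"));
                    // The driver consumes the stream; we never hold the full byte[].
                    ins.setBinaryStream(2, in, blob.length());
                    ins.executeUpdate();
                }
                blob.free();
                pg.commit(); // commit per row so neither side accumulates state
            }
        }
    }
}
```

Whether this is enough depends on how the driver buffers the parameter internally, but it at least removes the application-level copy.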
Thanks in advance!
Liu Yuanyuan
Aug 7, 2013
From: Chris Travers
Date: 2013-08-07 00:49
To: Tomas Vondra
CC: liuyuanyuan; pgsql-general
Subject: Re: inserting huge file into bytea cause out of memory

On Tue, Aug 6, 2013 at 7:04 AM, Tomas Vondra <tv@xxxxxxxx> wrote:
Hi, I have noticed a number of bytea/memory issues. This looks like Java,
and I am less familiar with that, but there are some things that occur to me.
There are a few things that make me relatively suspicious of using bytea
where the file size is big (large objects are more graceful in those areas,
IMO, because you can do seeking and chunking).
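To illustrate the seeking/chunking point: the standard java.sql.Blob interface (which pgjdbc backs with a server-side large object for oid columns) lets you read an arbitrary slice via getBinaryStream(pos, length) without fetching the rest of the value. A hypothetical sketch; the 8 KB chunk size is illustrative, and with bytea there is no equivalent partial read:

```java
import java.io.InputStream;
import java.io.OutputStream;
import java.sql.Blob;

// Sketch: copy a large object in bounded slices via java.sql.Blob, so at
// most one chunk of the value is resident in the Java heap at a time.
public class ChunkedLobRead {
    static final int CHUNK = 8192;

    static long readInChunks(Blob blob, OutputStream out) throws Exception {
        long total = 0;
        long size = blob.length();
        for (long pos = 1; pos <= size; pos += CHUNK) {             // Blob positions are 1-based
            long len = Math.min(CHUNK, size - pos + 1);
            try (InputStream in = blob.getBinaryStream(pos, len)) { // seek + bounded read
                byte[] buf = in.readAllBytes();                     // at most CHUNK bytes live here
                out.write(buf);
                total += buf.length;
            }
        }
        return total;
    }
}
```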
On the client side, a lot of the difficulties tend to have to do with
escaping and unescaping. While I have not done a lot with Java in this area,
I have found that Perl drivers sometimes use up to 10x as much memory to
process the file as the file would take up in binary format. I suspect this
has to do with copying the data, escaping it, and passing it on through.
For small files this is not an issue, but if you are passing 2 GB of data in,
you had better have a LOT of memory. I wouldn't be surprised if it were
similar in Java.
Now, if the front end and back end are on the same server, front-end memory
usage is going to count against you. Consequently you are going to have at
least the following memory counting against you:
1. The file in binary form
2. The file in escaped form
3. The file in escaped form on the back-end
4. The file in binary form on the back-end.
If hex escaping effectively doubles the size, that gives you 6x the memory
just for that data. If it is getting copied elsewhere for intermediary
usage, it could be significantly more.
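Putting rough numbers on that multiplier (the 2x hex-escape factor is the assumption here): for a 2 GB file, the four copies above come to about 12 GB before any intermediary copies.

```java
// Back-of-the-envelope memory estimate for a bytea round trip, following
// the 4-copy accounting above. The hex-escape factor of 2 is an assumption.
public class ByteaMemoryEstimate {
    static long estimateBytes(long fileBytes) {
        long binaryClient  = fileBytes;      // 1. file in binary form (client)
        long escapedClient = 2 * fileBytes;  // 2. file in hex-escaped form (client)
        long escapedServer = 2 * fileBytes;  // 3. hex-escaped form (back-end)
        long binaryServer  = fileBytes;      // 4. binary form (back-end)
        return binaryClient + escapedClient + escapedServer + binaryServer; // 6x
    }
}
```

Against a 1536 MB Java heap (the -Xmx above), even the two client-side copies alone are enough to explain an out-of-memory error for files in the hundreds of megabytes.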
So I would actually start by looking at memory utilization on your machine
(front- and back-end processes, if on the same machine!) and see what is
going on.

Best Wishes,
Chris Travers
Efficito: Hosted Accounting and ERP. Robust and Flexible.
No vendor lock-in.