Hello guys, we are trying to migrate from Oracle to PostgreSQL. One of the major requirements of our database is the ability to generate XML feeds, and some of our XML files are on the order of 500 MB+.
We are getting "Out of Memory" errors when doing an update on a table.
Here is some detail on the error:

    update test_text3 set test = test || test;

The table test_text3 contains only one record, and the column test holds a string of 382,637,520 characters (around 300+ MB).
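In case it helps, here is a minimal script that should reproduce the problem; the repeat() call is just an illustrative stand-in for our real XML data:

    -- Hypothetical reproduction; repeat() stands in for our actual data.
    CREATE TABLE test_text3 (test text);
    INSERT INTO test_text3 VALUES (repeat('x', 382637520));
    -- This statement then fails with "out of memory":
    UPDATE test_text3 SET test = test || test;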
Error message:

    ERROR:  out of memory
    DETAIL:  Failed on request of size 765275088.

Note that 765,275,088 bytes is almost exactly twice the string length, which I assume is the buffer needed to hold the result of test || test.

The server has 3 GB of RAM:

                 total       used       free     shared    buffers     cached
    Mem:       3115804     823524    2292280          0     102488     664224
    -/+ buffers/cache:      56812    3058992
    Swap:      5177336      33812    5143524

I tweaked the memory parameters of the server a bit to the following values, but still no luck:

    shared_buffers = 768MB
    effective_cache_size = 2048MB
    checkpoint_segments = 8
    checkpoint_completion_target = 0.8
    work_mem = 10MB
    max_connections = 50
    wal_buffers = 128

This error is consistent and reproducible every time I run that update. I can provide a detailed stack trace if needed. Any help would be highly appreciated.
Considering future scalability, we are trying to see how much data can be stored in a "text" column and written to the file system, as we found PostgreSQL's COPY command a very efficient way of writing data to a file.
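For what it's worth, the kind of COPY we use looks roughly like this (the table and file path are illustrative, and writing server-side files with COPY ... TO requires appropriate privileges):

    -- Hypothetical example of dumping the column to a file with COPY;
    -- the target path is made up.
    COPY (SELECT test FROM test_text3) TO '/tmp/feed.xml';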
Thanks in advance and best regards,
Zeeshan