
Re: Optimizing large data loads

Richard Huxton said:
> You don't say what the limitations of Hibernate are. Usually you might
> look to:
> 1. Use COPY not INSERTs

Not an option, unfortunately.

> 2. If not, block INSERTS into BEGIN/COMMIT transactions of say 100-1000

We're using 50 inserts per commit... we can easily up this, I suppose.
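For reference, a rough sketch of what bigger batches could look like on the Hibernate side (sketch only; sessionFactory, records, and the LogRecord entity are made-up names, and flushing/clearing the session each batch just keeps its cache from growing):

    import org.hibernate.Session;
    import org.hibernate.Transaction;

    // Sketch only: assumes a configured SessionFactory ("sessionFactory"),
    // a mapped entity class LogRecord, and "records" as the list to load.
    int batchSize = 1000;                        // up from the current 50 per commit
    Session session = sessionFactory.openSession();
    Transaction tx = session.beginTransaction();
    for (int i = 0; i < records.size(); i++) {
        session.save(records.get(i));
        if ((i + 1) % batchSize == 0) {
            session.flush();                     // execute the pending INSERTs
            session.clear();                     // free the session's first-level cache
            tx.commit();                         // close this transaction block
            tx = session.beginTransaction();     // and start the next one
        }
    }
    tx.commit();                                 // commit the final partial batch
    session.close();

Setting hibernate.jdbc.batch_size to a similar value in the Hibernate configuration should also let the driver group the INSERTs into JDBC batches.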

> 3. Turn fsync off

Done.

> 4. DROP/RESTORE constraints/triggers/indexes while you load your data

Hmmm... I'll have to think about this a bit. Not a bad idea, but I'm not sure
how we can make it work in our situation.
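In case it helps, the index handling wouldn't have to go through Hibernate at all; a plain JDBC connection on the side could drop and rebuild the index around the load. Sketch only, with invented index, table, column, and connection details:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    // Sketch only: assumes the PostgreSQL JDBC driver is already loaded and
    // that my_table_idx / my_table / some_column exist; names are invented.
    Connection conn = DriverManager.getConnection(
            "jdbc:postgresql://localhost/mydb", "loader", "secret");
    Statement st = conn.createStatement();
    st.execute("DROP INDEX my_table_idx");       // drop before the bulk load
    // ... run the Hibernate load here ...
    st.execute("CREATE INDEX my_table_idx ON my_table (some_column)");
    st.close();
    conn.close();

Constraints could be handled the same way with ALTER TABLE ... DROP CONSTRAINT / ADD CONSTRAINT around the load.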

> 5. Increase sort_mem/work_mem in your postgresql.conf when recreating
> indexes etc.
> 6. Use multiple processes to make sure the I/O is maxed out.

Suggestion 5 falls in line with 4, and 6 is definitely doable.
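For 5, issuing a per-session "SET sort_mem = 65536" (work_mem on 8.0 and later; value in kB) just before the CREATE INDEX should get the larger sort memory without touching postgresql.conf. For 6, here's a rough sketch of splitting the load across a few worker threads, each with its own Session (chunks, LogRecord, sessionFactory, and the thread count of 4 are all invented):

    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import org.hibernate.Session;
    import org.hibernate.Transaction;

    // Sketch only: "chunks" is a hypothetical pre-split List<List<LogRecord>>.
    // Each worker opens its own Session; Sessions are not thread-safe, but the
    // SessionFactory is, so sharing it across threads is fine.
    ExecutorService pool = Executors.newFixedThreadPool(4);
    for (final List<LogRecord> chunk : chunks) {
        pool.submit(new Runnable() {
            public void run() {
                Session s = sessionFactory.openSession();
                Transaction tx = s.beginTransaction();
                for (LogRecord r : chunk) {
                    s.save(r);
                }
                tx.commit();
                s.close();
            }
        });
    }
    pool.shutdown();                             // let the workers drain the queue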

Thanks for the suggestions!

John


