Re: Optimization required for multiple insertions in PostgreSQL

On 3 November 2011, 16:52, siva palanisamy wrote:
> I basically have 3 tables: one core table and two that depend on it. I
> need to add up to 70000 records to the tables. I have constraints
> (primary and foreign keys, indexes, unique etc.) set on the tables. I
> can't use bulk import (the COPY command) as there is no standard .csv
> file, the mapping is explicitly required, and a few validations are
> applied externally in a C program. Each record (up to 70000) is passed
> from a .pgc file (an ECPG-based C program) to PostgreSQL. The first few
> records take little time, but performance degrades badly for the later
> ones. The sad result is that it takes days to get through 20000! What
> performance measures could I take here? Please guide me.

As Kevin already pointed out, this description is far too vague. We need
to know at least the following for starters:

- version of PostgreSQL
- environment (what OS, what hardware - CPU, RAM, drives)
- basic PostgreSQL config values (shared_buffers, checkpoint_segments)
- structure of the tables, indexes etc.
- output of vmstat/iostat collected when the inserts are slow

And BTW, the fact that you're not using a standard .csv file does not mean
you can't use COPY. You can either transform the data to CSV beforehand or
generate it on the fly.

Tomas


-- 
Sent via pgsql-performance mailing list (pgsql-performance@xxxxxxxxxxxxxx)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-performance

