Hi All. I basically have three tables: one core table and two that depend on it. I have a requirement to insert up to 70,000 records across these tables. The tables have constraints set (primary and foreign keys, indexes, unique constraints, etc.).
I can't use bulk import (the COPY command) because there is no standard .csv file in the requirement: the column mapping is explicit, and a few validations are applied externally in a C program. Each record (up to 70,000 of them)
is passed from a .pgc file (an ECPG-based C program) to PostgreSQL. The first few records take little time, but performance keeps degrading for the later records! The sad result is that it takes days just to cover 20,000! What are the
performance measures I could take here? Please guide me.
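The usual culprit in this pattern is committing (or letting the driver auto-commit) every INSERT individually, so each row pays the full cost of a transaction plus index and foreign-key maintenance. Below is a minimal, hedged sketch of batching the inserts in ECPG, assuming a hypothetical core table `core_tbl(id, name)` and a batch size of 1,000; the table, column names, connection target, and batch size are all my own illustration, not taken from the original post:

```c
/* bulk_insert.pgc -- sketch only: batch many INSERTs per COMMIT.
 * Build with: ecpg bulk_insert.pgc && cc bulk_insert.c -lecpg
 * Assumes a table core_tbl(id integer primary key, name varchar(64));
 * these names are illustrative, not from the original post. */
#include <stdio.h>
#include <stdlib.h>

EXEC SQL INCLUDE sqlca;

#define BATCH_SIZE 1000   /* commit every 1,000 rows instead of every row */

int main(void)
{
    EXEC SQL BEGIN DECLARE SECTION;
    int  id;
    char name[64];
    EXEC SQL END DECLARE SECTION;

    EXEC SQL CONNECT TO mydb;        /* connection target is illustrative */
    EXEC SQL SET AUTOCOMMIT TO OFF;  /* ensure each INSERT is NOT its own txn */

    for (id = 1; id <= 70000; id++) {
        /* stand-in for the real mapping/validation done in the C code */
        snprintf(name, sizeof(name), "record-%d", id);

        EXEC SQL INSERT INTO core_tbl (id, name) VALUES (:id, :name);
        if (sqlca.sqlcode < 0) {
            fprintf(stderr, "insert failed at row %d: %s\n",
                    id, sqlca.sqlerrm.sqlerrmc);
            EXEC SQL ROLLBACK;
            exit(1);
        }

        if (id % BATCH_SIZE == 0)    /* flush a batch; keeps txns short */
            EXEC SQL COMMIT;
    }
    EXEC SQL COMMIT;                 /* final partial batch */

    EXEC SQL DISCONNECT;
    return 0;
}
```

Other measures worth trying, under the same assumptions: use prepared statements (EXEC SQL PREPARE once, EXECUTE per row) so the statement is parsed and planned only once; drop any non-constraint indexes before the load and recreate them afterwards; and run ANALYZE on the core table partway through the load so the foreign-key checks from the two dependent tables use an index lookup instead of a sequential scan over a growing table, which is a classic cause of "fast at first, slower and slower" bulk loads.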