Re: Configuration Advice

How many rows do you typically load each night?  If it is, say, less than
10% of the total rows, then perhaps the suggestion in the next paragraph
is reasonable.

Hrm. It's very, very variable. I'd say it's more than 10% on average, and it can actually be pretty close to 50-100% on certain days. Our data is based upon customer submissions, and normally it's a daily basis kind of deal, but sometimes they'll resubmit their entire year on certain deadlines to make sure it's all in. Now, we don't have to optimize for those deadlines, just the 'average daily load'. It's okay if on those deadlines it takes forever, because that's understandable.

However, I will look into this and see if I can figure out that average value. It may well be a valid approach, and I'll give it some more thought.


Thanks!

Steve

SO ... our goal here is to make this load process take less time.  It
seems the big part is building the big summary table; this summary
table is currently 9 million rows.  Every night, we drop the table,
re-create it, build the 9 million rows of data (we use COPY to put the
data in when it's prepared, not INSERT), and then build the indexes on it.
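
(For reference, the nightly rebuild described above boils down to roughly the
following sequence; the table, column, and file names here are made up for
illustration and are not from the actual schema.)

  BEGIN;
  DROP TABLE summary;
  CREATE TABLE summary (
      customer_id integer,
      day         date,
      total       numeric,
      n           bigint
  );
  -- bulk load the prepared rows in one pass rather than row-by-row INSERTs
  COPY summary FROM '/path/to/prepared_summary.dat';
  CREATE INDEX summary_customer_idx ON summary (customer_id);
  CREATE INDEX summary_day_idx ON summary (day);
  COMMIT;
  ANALYZE summary;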


Perhaps placing a trigger on the source table and building a "change log"
would be useful.  For example, you could scan the change log (looking for
inserts, updates, and deletes) and integrate those changes into your summary
table.  Obviously, if you are using complex aggregates it may not be possible
to adjust the summary table incrementally, but if you are performing simple
SUMs, COUNTs, etc., then this is a workable solution.
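
To make that concrete, here is a minimal sketch of the change-log idea,
assuming a made-up source table "submissions" and a summary table keyed by
(customer_id, day) that carries only simple SUM and COUNT aggregates; all of
the names are illustrative, not from the actual schema.

  -- change log populated by a trigger on the source table
  CREATE TABLE submission_changes (
      op          char(1)     NOT NULL,   -- 'I' = row added, 'D' = row removed
      customer_id integer     NOT NULL,
      day         date        NOT NULL,
      amount      numeric     NOT NULL,
      logged_at   timestamptz NOT NULL DEFAULT now()
  );

  CREATE OR REPLACE FUNCTION log_submission_change() RETURNS trigger AS $$
  BEGIN
      IF TG_OP = 'INSERT' THEN
          INSERT INTO submission_changes (op, customer_id, day, amount)
          VALUES ('I', NEW.customer_id, NEW.day, NEW.amount);
      ELSIF TG_OP = 'UPDATE' THEN
          -- log an update as "old row removed, new row added"
          INSERT INTO submission_changes (op, customer_id, day, amount)
          VALUES ('D', OLD.customer_id, OLD.day, OLD.amount);
          INSERT INTO submission_changes (op, customer_id, day, amount)
          VALUES ('I', NEW.customer_id, NEW.day, NEW.amount);
      ELSE  -- DELETE
          INSERT INTO submission_changes (op, customer_id, day, amount)
          VALUES ('D', OLD.customer_id, OLD.day, OLD.amount);
      END IF;
      RETURN NULL;  -- AFTER trigger, so the return value is ignored
  END;
  $$ LANGUAGE plpgsql;

  CREATE TRIGGER submissions_changelog
      AFTER INSERT OR UPDATE OR DELETE ON submissions
      FOR EACH ROW EXECUTE PROCEDURE log_submission_change();

  -- nightly: fold the logged changes into the summary instead of rebuilding it
  UPDATE summary s
     SET total = s.total + d.delta,
         n     = s.n + d.n_delta
    FROM (SELECT customer_id, day,
                 SUM(CASE op WHEN 'I' THEN amount ELSE -amount END) AS delta,
                 SUM(CASE op WHEN 'I' THEN 1 ELSE -1 END)           AS n_delta
            FROM submission_changes
           GROUP BY customer_id, day) d
   WHERE s.customer_id = d.customer_id AND s.day = d.day;

  -- (customer, day) groups that are brand new would still need an INSERT pass,
  -- omitted here to keep the sketch short
  TRUNCATE submission_changes;

On the deadline days where customers resubmit nearly a whole year, the change
log will be almost as large as the source data, so the full rebuild may still
win there; the trigger approach mainly pays off on an ordinary daily load.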


--
Chad
http://www.postgresqlforums.com/


