I have a database with a huge amount of data, so I'm trying to make it as fast as possible and minimize the space it uses.
One thing I've done is join against a prepopulated date lookup table so I'm not storing a duplicate date column on every row. Without this I'd have about 2,500 rows per hour with the exact same date w/ timestamp in them.
My question is: with Postgres, do I really gain anything by this, or should I just put the date w/ timestamp column on the primary table and ditch the join on the date_id table?
The primary table is all integers, like:

date_id | num1 | num2 | num3 | num4 | num5 | num6 | num7 | num8
----------------------------------------------------------------

The primary key is the composite of the date_id through num6 columns.
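Roughly, in DDL terms (the real table and column names are different; this is just to show the shape):

    -- monthly data table, all integer columns
    -- (table/column names here are just placeholders)
    CREATE TABLE data_2007_02 (
        date_id integer NOT NULL,
        num1    integer NOT NULL,
        num2    integer NOT NULL,
        num3    integer NOT NULL,
        num4    integer NOT NULL,
        num5    integer NOT NULL,
        num6    integer NOT NULL,
        num7    integer NOT NULL,
        num8    integer NOT NULL,
        PRIMARY KEY (date_id, num1, num2, num3, num4, num5, num6)
    );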
The date_id lookup table is prepopulated with the date values that will be used:

date_id | date w/ timestamp
----------------------------------------
      1 | 2007-02-15 00:00:00
      2 | 2007-02-15 01:00:00
      3 | 2007-02-15 02:00:00

...and so on, 24 rows for each day.
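Which is basically this (again, names are only for illustration):

    -- prepopulated lookup of every hour that will appear in the data
    CREATE TABLE date_lookup (
        date_id integer PRIMARY KEY,
        stamp   timestamp NOT NULL
    );

    INSERT INTO date_lookup VALUES (1, '2007-02-15 00:00:00');
    INSERT INTO date_lookup VALUES (2, '2007-02-15 01:00:00');
    INSERT INTO date_lookup VALUES (3, '2007-02-15 02:00:00');
    -- ...and so on for all 24 hours of each day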
Each day 60k records are added to a monthly table structured as above, about 2500 per hour.
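The alternative I'm asking about would just store the timestamp inline, with no lookup table and no join, something like:

    -- same monthly table, but with the timestamp stored directly
    -- (sketch only; names are placeholders)
    CREATE TABLE data_2007_02_alt (
        stamp timestamp NOT NULL,
        num1  integer NOT NULL,
        num2  integer NOT NULL,
        num3  integer NOT NULL,
        num4  integer NOT NULL,
        num5  integer NOT NULL,
        num6  integer NOT NULL,
        num7  integer NOT NULL,
        num8  integer NOT NULL,
        PRIMARY KEY (stamp, num1, num2, num3, num4, num5, num6)
    );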
Thank you for your advice.