On Wed, Oct 26, 2011 at 4:14 AM, Lester Caine <lester@xxxxxxxxxxx> wrote:
> Tommy Pham wrote:
>>> Many of my customers are coming up on 20 years of data available.
>>> There has been a debate on transferring historic data to a separate
>>> database, but having it available is not causing a problem, except
>>> for some counts and larger search actions, and being able to see how
>>> long a client has been visiting is often useful. Statistical analysis
>>> is always done on a separate machine, with a replicated copy of the
>>> data, so as not to affect the active users ...
>>
>> What kind of counts/filters? What kind of RAID subsystem is the
>> storage on? What's the total size of the DB? Up to 20 years of data
>> should be in the petabyte range. In that range, if you're not having
>> performance issues and not using RAID 0, 0+1, 10, 50, or 60, I'd love
>> to hear about the application and database design in detail. :)
>
> We are still only in the hundreds of Mb, and the historic data has
> less detail than the current 'transactions'. The current postcode
> table is 500Mb, and while the LLPG data would increase that by a
> factor of around 100, it's currently restricted to a council's
> immediate working area, so we keep the problem contained, dropping
> back to postcode for out-of-area enquiries. Users complain if an
> enquiry takes more than a few seconds, but Firebird is giving me more
> than adequate performance, and it allows shadow data to be maintained
> via triggers to reduce the need for 'counting'.
>
> I have a new 'application' which uses the same search criteria, but
> the data volume is growing a lot faster (10Gb on the test system
> here), and I am still seeing the same search speeds once the correct
> indexes have been created. It will take a few more years before that
> starts reaching the 100Gb level :)
>
> --
> Lester Caine - G8HFL
> -----------------------------
> Contact - http://lsces.co.uk/wiki/?page=contact
> L.S.Caine Electronic Services - http://lsces.co.uk
> EnquirySolve - http://enquirysolve.com/
> Model Engineers Digital Workshop - http://medw.co.uk/
> Firebird - http://www.firebirdsql.org/index.php

I'm just curious: what's the total row count? Twenty years of accumulated data taking only that much space doesn't suggest a lot is going on each year. All the DBAs I know currently deal with additions/imports of at least 1 million rows per week. I didn't bother asking them how far back they keep the data, as that amount of rows is too overwhelming for me for the moment, since I'm not a DBA :)
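For anyone following along, the 'shadow data via triggers' idea Lester mentions is basically keeping a pre-aggregated counter table up to date on every insert, so a query never needs to run a big COUNT(*) over the history. A rough Firebird sketch of that technique (the VISITS, VISIT_COUNTS and CLIENT_ID names here are made up for illustration, not Lester's actual schema) might look like this:

/* Shadow table holding one pre-computed count per client (hypothetical schema) */
CREATE TABLE VISIT_COUNTS (
  CLIENT_ID   INTEGER NOT NULL PRIMARY KEY,
  VISIT_COUNT INTEGER DEFAULT 0 NOT NULL
);

SET TERM ^ ;

/* Keep the shadow count in step with the (hypothetical) VISITS table */
CREATE TRIGGER VISITS_AI FOR VISITS
AFTER INSERT
AS
BEGIN
  UPDATE VISIT_COUNTS
     SET VISIT_COUNT = VISIT_COUNT + 1
   WHERE CLIENT_ID = NEW.CLIENT_ID;

  /* First visit for this client: create the counter row */
  IF (ROW_COUNT = 0) THEN
    INSERT INTO VISIT_COUNTS (CLIENT_ID, VISIT_COUNT)
    VALUES (NEW.CLIENT_ID, 1);
END^

SET TERM ; ^

A "how many visits has this client made" enquiry then reads one row from VISIT_COUNTS instead of scanning the whole history, which is presumably what keeps the counts fast as the tables grow. A matching AFTER DELETE trigger would be needed if rows are ever removed.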