Dave Goodchild wrote:
Hi all. I am writing a web app with a MySQL back end, and there is every chance one of the tables will have to handle 56+ million records. I am no MySQL expert, but has anyone else here handled that volume of data, and if so, do you have any suggestions or caveats? The tables will of course be correctly indexed and the database normalised.
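To give a concrete picture of what I mean by "correctly indexed", here is a minimal sketch of the kind of table I have in mind (the table and column names are purely illustrative, not from the actual app):

    # Hypothetical example: an InnoDB table expected to grow past
    # 56 million rows, with a composite index on the columns used
    # in the common WHERE clauses.
    CREATE TABLE page_hits (
        id        BIGINT UNSIGNED NOT NULL AUTO_INCREMENT,
        user_id   INT UNSIGNED    NOT NULL,
        url       VARCHAR(255)    NOT NULL,
        hit_time  DATETIME        NOT NULL,
        PRIMARY KEY (id),
        KEY idx_user_time (user_id, hit_time)  # index for lookups by user and date range
    ) ENGINE=InnoDB;

    # At this row count every big query should hit an index; check with EXPLAIN:
    EXPLAIN SELECT COUNT(*) FROM page_hits
    WHERE user_id = 42 AND hit_time >= '2008-01-01';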
There's no reason why it can't, but the MySQL list would be a better place to ask: they can tell you what to look for in terms of hardware (how much memory, what kind of disks, etc.) to get decent performance.
Even with indexes and normalised data (which in some cases makes performance worse because of all the joins you have to do), you'll need to tune your server settings to get something resembling "reasonable" performance.
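For example (illustrative values only, assuming a dedicated database box with a few GB of RAM and mostly InnoDB tables; size everything to your own hardware and workload), a my.cnf starting point might look something like:

    [mysqld]
    # Illustrative starting values only -- tune to your RAM and workload.
    innodb_buffer_pool_size        = 2G    # most important InnoDB setting; give it a large share of RAM on a dedicated box
    innodb_log_file_size           = 256M  # larger logs help sustained write loads
    innodb_flush_log_at_trx_commit = 2     # trades up to a second of durability for much faster writes
    key_buffer_size                = 256M  # only used for MyISAM indexes
    tmp_table_size                 = 64M
    max_heap_table_size            = 64M   # keep implicit temp tables from big joins/sorts in memory

And before blaming the server, run EXPLAIN on your heaviest queries to confirm they are actually using the indexes.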
--
Postgresql & php tutorials
http://www.designmagick.com/