On 2011-10-24, at 8:50 PM, Jason Pruim <lists@xxxxxxxxxxxxxxxxxxxx> wrote:

> Now that I've managed to list 3 separate programming languages and
> somewhat tie it back into PHP, here's the question...
>
> I have about 89 million records in MySQL. The initial load of the page
> takes 2 to 3 minutes. I am using pagination, so I have LIMITs on the
> SQL queries, but they just aren't going fast enough...
>
> What I would like to do is pull the data out of MySQL, store it in
> HTML files, and then update those HTML files once a day/week/month. I
> can figure most of it out... BUT... how do I automatically link to the
> individual pages?
>
> I have the site working when you pull it from MySQL; it's just that
> the load time sucks. Any suggestions on where I can pull some more
> info from? :)
>
> Thanks in advance!
>
> Jason Pruim
> lists@xxxxxxxxxxxxxxxxxxxx

That's a ton of data. So there are a couple of questions:

1. Is the data ordered in any way? The issue is that you might need to
regenerate the files if new data needs to be interspersed.

2. Why is pagination slow? A LIMIT with an OFFSET should be very quick
if the table is properly indexed. Is there any tuning you can do to
further filter the results, say by date or some other criterion? Have
you run EXPLAIN on the queries to see how they are actually being
executed?

Other thoughts:

- Are your indexes up to date? Have you optimized those indexes so they
are clean, not fragmented, and therefore fast?

- Can you organize the data by some criterion to better present the
results? By date? By starting letter? By category?

I've put rough sketches of both ideas (faster pagination, and generating
the static pages with their links) below my sig.

Bastien Koert
905-904-0334
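On the pagination side, one thing EXPLAIN won't fix: with a big OFFSET,
MySQL still has to read and throw away every skipped row, so deep pages
stay slow no matter how good the index is. A "seek" (keyset) approach
avoids that by remembering the last id displayed and asking only for
rows after it. Here is a minimal sketch with PDO; the table name
"records", the "id" and "name" columns, and the credentials are all
placeholders for whatever the real schema looks like:

<?php
// Keyset ("seek") pagination: instead of OFFSET, remember the last id
// shown on the previous page and ask only for rows after it. MySQL can
// answer this straight off the PRIMARY KEY no matter how deep you page.
// Running EXPLAIN on this SELECT should show a range read on the key
// rather than a full scan.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$perPage = 50;
$lastId  = isset($_GET['last_id']) ? (int)$_GET['last_id'] : 0;

$stmt = $pdo->prepare(
    'SELECT id, name FROM records WHERE id > :last_id ORDER BY id LIMIT :lim'
);
$stmt->bindValue(':last_id', $lastId, PDO::PARAM_INT);
$stmt->bindValue(':lim', $perPage, PDO::PARAM_INT);
$stmt->execute();

$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
foreach ($rows as $row) {
    echo htmlspecialchars($row['name']), "<br>\n";
}

// The "next" link carries the last id we actually showed.
if ($rows) {
    $last = end($rows);
    printf('<a href="?last_id=%d">Next</a>', $last['id']);
}
?>

The trade-off is that you only get next/previous links rather than a
jump to an arbitrary page number, which is usually fine for a
browse-style listing.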
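As for storing the data in HTML files and automatically linking the
individual pages: if the script that writes the files also picks the
page size, it knows up front how many pages there are and what each
file is called, so the prev/next links are plain arithmetic on the page
number. A cron-able sketch along the same lines (same placeholder names
as above, plus an assumed output directory):

<?php
// Regenerate all static pages; run from cron once a day/week/month.
// Because this script decides the page size, it knows how many pages
// exist and what each file is called, so it can emit prev/next links
// without any lookups. Table, columns, and paths are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$perPage = 500;
$outDir  = '/var/www/html/pages';

$total = (int)$pdo->query('SELECT COUNT(*) FROM records')->fetchColumn();
$pages = (int)ceil($total / $perPage);

$stmt = $pdo->prepare(
    'SELECT id, name FROM records WHERE id > :last_id ORDER BY id LIMIT :lim'
);

$lastId = 0;
for ($p = 1; $p <= $pages; $p++) {
    // Walk the table in keyset order so each chunk is cheap to fetch.
    $stmt->bindValue(':last_id', $lastId, PDO::PARAM_INT);
    $stmt->bindValue(':lim', $perPage, PDO::PARAM_INT);
    $stmt->execute();
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
    if (!$rows) {
        break;
    }
    $last   = end($rows);
    $lastId = $last['id'];

    $html = "<html><body>\n";
    foreach ($rows as $row) {
        $html .= htmlspecialchars($row['name']) . "<br>\n";
    }
    // Prev/next links are pure arithmetic on the page number.
    if ($p > 1) {
        $html .= sprintf('<a href="page-%06d.html">Prev</a> ', $p - 1);
    }
    if ($p < $pages) {
        $html .= sprintf('<a href="page-%06d.html">Next</a>', $p + 1);
    }
    $html .= "\n</body></html>";

    file_put_contents(sprintf('%s/page-%06d.html', $outDir, $p), $html);
}
?>

That also bears on question 1: if the display order is ascending id,
new rows only ever land at the end, so the nightly run could rebuild
just the last few pages instead of all of them. If new data really does
get interspersed, you're back to regenerating everything.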