Fetching 10 rows is much faster than fetching 1000! Fourat, your code is already optimized, so just keep it as it is :) And keep your code away from ADOdb, PEAR DB, and similar abstraction layers if you want speed! There's no need to worry about optimisation over just 2 queries.
Regards, Hatem
Depends on the DB; in many cases the times are so similar that it isn't worth the trouble - but yes, I agree, LIMITs are definitely worthwhile.
Run the query without the LIMIT; this gives you the count - don't actually fetch the rows. Then run the same query with the LIMIT. If your database is worth anything (most any is), it has the first query cached, so the second run takes negligible extra time, and you don't have to spend time fetching and discarding the first X rows yourself. Alternatively, if your database interface functions support seeking ahead in a result set, use that instead.
Obviously, for page 1 of a paginated list, this performs worse than just running the single query. But if you get to page 99, you'll likely find this is faster. Feel free to do your own tests; many factors can change all of these findings, and it's best to tune the approach to suit your own scenario.
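Something like the following is a minimal sketch of that two-query approach, assuming a mysqli connection in $mysqli and a hypothetical articles table - the table, columns, and variable names are illustrative, not from the original post:

<?php
$perPage = 10;
$page    = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
$offset  = ($page - 1) * $perPage;

$sql = "SELECT id, title FROM articles ORDER BY created_at DESC";

// 1) Run the query without a LIMIT just to learn the total row count;
//    we never loop over the rows in PHP, we only read num_rows.
$countResult = $mysqli->query($sql);
$totalRows   = $countResult->num_rows;
$countResult->free();

// 2) Run the same query again with the LIMIT for the current page.
//    If the server still has the first run cached, this costs little extra.
$pageResult = $mysqli->query($sql . " LIMIT $offset, $perPage");
while ($row = $pageResult->fetch_assoc()) {
    echo $row['id'] . ': ' . $row['title'] . "\n";
}
$pageResult->free();

$totalPages = (int) ceil($totalRows / $perPage);
echo "Page $page of $totalPages ($totalRows rows total)\n";
?>

Whether the second run really is near-free depends on your server's caching, so benchmark it against a single plain query (and against a separate SELECT COUNT(*)) on your own data.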
cheers,
--
- Martin Norland, Sys Admin / Database / Web Developer, International Outreach x3257
The opinion(s) contained within this email do not necessarily represent those of St. Jude Children's Research Hospital.