Re[2]: Optimising LIMITs - alter table order by...

Hello Pavel,

Tuesday, March 2, 2004, 12:33:17 PM, you wrote:

PL> http://www.mysql.com/documentation/mysql/bychapter/manual_SQL_Syntax.html#ALTER_TABLE
PL> ORDER BY allows you to create the new table with the rows in a specific
PL> order. Note that the table will not remain in this order after inserts
PL> and deletes. In some cases, it might make sorting easier for MySQL if

Figured as much, thanks for the reference. I hope one day MySQL will
produce a manual that isn't one horrendously long document per section.
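
For the archives, the statement the manual is describing would be
something along these lines (just a sketch, assuming I want the
physical row order to match my usual created-based listing):

ALTER TABLE thread ORDER BY created;
-- re-sorts the MyISAM data file once; as the manual says, the order
-- is not maintained by later INSERTs and DELETEs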

PL> You need to split the table and move "post messages" to another table if
PL> you want to get higher performance for (2).

With regard to disk seeking, here is my table structure:

CREATE TABLE `thread` (
  `threadid` int(10) unsigned NOT NULL auto_increment,
  `site_user_id` varchar(32) NOT NULL default '',
  `boardid` tinyint(3) unsigned NOT NULL default '0',
  `subject` varchar(200) NOT NULL default '',
  `modified` timestamp(14) NOT NULL,
  `created` timestamp(14) NOT NULL,
  `status` enum('L','P','H','D') NOT NULL default 'L',

  PRIMARY KEY  (`threadid`),
  FULLTEXT KEY `subject` (`subject`),
  KEY `boardid` (`boardid`),
  KEY `site_user_id` (`site_user_id`),
  KEY `created` (`created`),
  KEY `status` (`status`)
) TYPE=MyISAM;

As well as moving the subject field to another table (because it's the
only sizeable variable-length field), would another way of speeding up
the disk seek be to turn it from a varchar(200) into a char(200)? I
know it means a larger table size, but MySQL should then be able to
calculate exactly where to jump to in the data file?
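
If I've understood the static/dynamic row format notes correctly,
something like this would be the change (only a sketch; note that
site_user_id is also a varchar(32), so it would have to be converted
or moved out too, otherwise MyISAM keeps the table in dynamic format):

ALTER TABLE thread
  MODIFY site_user_id char(32) NOT NULL default '',
  MODIFY subject char(200) NOT NULL default '';

SHOW TABLE STATUS LIKE 'thread';
-- Row_format should now read 'Fixed': every row is the same length,
-- so MyISAM can seek straight to row N in the data file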

PL> All MySQL functions should be very optimized, but there may be some
PL> exceptions. BENCHMARK(loop_count,expression) is a great tool to find out
PL> if this is a problem with your query.

I will look at this now.
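
For anyone else following the thread, the usage is simply this, with
the loop count and expression swapped for whatever is being tested
(the CONCAT here is just a stand-in expression):

SELECT BENCHMARK(1000000, CONCAT('thread ', 12345));
-- always returns 0; the elapsed time the mysql client reports is
-- what shows how expensive the expression is per call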

I had been toying with the idea of creating a cache table that holds
the 200 most recent threads, pre-sequenced, so I can bring that back
directly instead of querying the main table unless someone pages
beyond that 200-thread limit.
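
Roughly what I have in mind is the following (table and column names
are only a sketch, and I'm assuming status 'L' means a live thread):

CREATE TABLE thread_cache (
  position tinyint unsigned NOT NULL,      -- 1 = newest thread
  threadid int unsigned NOT NULL,
  subject varchar(200) NOT NULL default '',
  created timestamp(14) NOT NULL,
  PRIMARY KEY (position)
) TYPE=MyISAM;

-- rebuilt whenever a new thread is posted
SET @pos := 0;
DELETE FROM thread_cache;
INSERT INTO thread_cache (position, threadid, subject, created)
  SELECT @pos := @pos + 1, threadid, subject, created
  FROM thread
  WHERE status = 'L'
  ORDER BY created DESC
  LIMIT 200;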

Your comments (and MySQL manual posts) have been very useful, thank
you.

-- 
Best regards,
 Richard Davey
 http://www.phpcommunity.org/wiki/296.html


