Hello PHPers,

This is a two-part question:

1) Is it faster to include one file with lots of code, or many smaller individual files? Assume the one massive file is simply the concatenation of all the smaller files. (I'm assuming the one massive file would be faster, but I wanted to get confirmation.)

2) Suppose PHP has to invoke include 100 times, and that all the files are about the same size and contain the same number of instructions. Would it be faster to include the exact same file 100 times rather than 100 different files? Basically, does the engine/parser take any shortcuts once it notices that a given file name has already been included?

I would test this myself, but I don't want to create hundreds of different files by hand (though see the rough sketch below, which generates them).

Thanks,
dK
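P.S. Here is a minimal sketch of the kind of benchmark I have in mind, in case it's useful. The directory name, the count of 100, and the statements written into each generated file are arbitrary choices, and it times everything in a single request, so treat the numbers as rough. The generated files deliberately contain only plain statements (no function or class definitions), so the same file can be re-included with include without "cannot redeclare" errors.

<?php
// bench_includes.php -- rough, unscientific timing sketch.
// Generates $n tiny include files plus one concatenated file,
// then times three scenarios in a single request.

$n   = 100;
$dir = dirname(__FILE__) . '/inc_test';
if (!is_dir($dir)) {
    mkdir($dir);
}

// Each generated file holds a few plain statements -- no function or class
// definitions, so the same file can safely be included more than once.
$big = "<?php\n";
for ($i = 0; $i < $n; $i++) {
    $stmts = "\$x = $i; \$y = \$x * 2; \$z = str_repeat('a', 10);\n";
    file_put_contents("$dir/part_$i.php", "<?php\n" . $stmts);
    $big .= $stmts;
}
file_put_contents("$dir/all.php", $big);

// 1) One big concatenated file.
$t = microtime(true);
include "$dir/all.php";
printf("one big file:        %.6f s\n", microtime(true) - $t);

// 2) $n different small files.
$t = microtime(true);
for ($i = 0; $i < $n; $i++) {
    include "$dir/part_$i.php";
}
printf("%d different files: %.6f s\n", $n, microtime(true) - $t);

// 3) The same small file included $n times (include, not include_once,
//    which would run the file only the first time).
$t = microtime(true);
for ($i = 0; $i < $n; $i++) {
    include "$dir/part_0.php";
}
printf("same file %d times: %.6f s\n", $n, microtime(true) - $t);

Running it once as-is and once with an opcode cache such as APC enabled should show how much of the difference comes from compiling the files versus executing them.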