On Sunday 20 December 2009 10:45:45 am Daniel Kolbo wrote:
> Hello PHPers,
>
> This is a two part question:
>
> 1) Is it faster to include one file with lots of code, or many separate
> smaller individual files? Assume the one massive file is merely the
> concatenation of all the smaller individual files. (I am assuming the
> one massive file would be faster..., but i wanted to get confirmation).

Conventional wisdom is that the one big file is faster, since it requires
one disk I/O hit instead of several. HOWEVER, if you're only using a small
portion of that code then it could be faster to load only the code you
really need. Where the trade-off lies varies with your architecture, the
amount of code, and how good the disk caching of your OS is.

> 2) Suppose php has to invoke the include function 100 times. Suppose
> all files are on average the same size and contain the same number of
> instructions. Would it be faster to include the same exact file 100
> times as opposed to 100 different file names? Basically, does the
> engine/parser take any shortcuts if it notices that the file name has
> already been included once?

I'm pretty sure that PHP will recognize that it's already parsed that
file and keep the compiled opcodes in memory, so it needn't hit disk
again. I've not checked into that part of the engine, though, so I may
be wrong there.

--
Larry Garfield
larry@xxxxxxxxxxxxxxxx

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
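
For question 1, the honest answer is "measure it on your own setup." A minimal
benchmark sketch (the file names `big.php` and `part_0.php` .. `part_99.php`
are hypothetical; it assumes `big.php` is the concatenation of all the parts):

```php
<?php
// Hypothetical benchmark: one concatenated include vs. 100 small includes.
// Run each variant in a separate process so opcode/disk caches start cold.

$start = microtime(true);
include 'big.php';                    // one file, one I/O hit
$oneFile = microtime(true) - $start;

$start = microtime(true);
for ($i = 0; $i < 100; $i++) {
    include "part_$i.php";            // many files, many I/O hits
}
$manyFiles = microtime(true) - $start;

printf("one file: %.4fs, many files: %.4fs\n", $oneFile, $manyFiles);
```

On a warm OS disk cache the gap usually narrows considerably, which is why
the answer depends so much on the environment.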
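
One detail on question 2 that is easy to verify: a plain `include` re-reads
and re-executes the file every time, regardless of whether it was included
before; only `include_once`/`require_once` check the already-included list.
A small sketch (the file `counter.php` is hypothetical and is assumed to
contain just `<?php $count = isset($count) ? $count + 1 : 1;`):

```php
<?php
// counter.php (hypothetical) increments $count each time it runs.
include 'counter.php';       // executes counter.php: $count == 1
include 'counter.php';       // plain include re-executes it: $count == 2
include_once 'counter.php';  // skipped: file already in the included list
echo $count; // 2 -- include_once prevented a third execution

var_dump(get_included_files()); // lists counter.php once per unique path
```

Whether the *compilation* step is repeated is a separate matter: without an
opcode cache such as APC the engine re-parses the file on each plain
`include`; with one, the compiled opcodes are cached per file path, so
including the same file 100 times at least avoids recompiling it.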