Re: efficiency of include()

Daniel Kolbo wrote:
> Larry Garfield wrote:
>> On Sunday 20 December 2009 10:45:45 am Daniel Kolbo wrote:
>>> Hello PHPers,
>>>
>>> This is a two part question:
>>>
>>> 1) Is it faster to include one file with lots of code, or many separate
>>> smaller individual files?  Assume the one massive file is merely the
>>> concatenation of all the smaller individual files.  (I am assuming the
>>> one massive file would be faster..., but I wanted to get confirmation).
>> Conventional wisdom is that the one big file is faster, since it requires one 
>> disk I/O hit instead of several.  HOWEVER, if you're only using a small 
>> portion of that code then it could be faster to load only the code you really 
>> need.  Where the trade off is varies with your architecture, the amount of 
>> code, and how good the disk caching of your OS is.
>>
>>> 2) Suppose php has to invoke the include function 100 times.  Suppose
>>> all files are on average the same size and contain the same number of
>>> instructions.  Would it be faster to include the same exact file 100
>>> times as opposed to 100 different file names?  Basically, does the
>>> engine/parser take any shortcuts if it notices that the file name has
>>> already been included once?
>> I'm pretty sure that PHP will recognize that it's already parsed that file and 
>> keep the opcode caches in memory, so it needn't hit disk again.  I've not 
>> checked into that part of the engine, though, so I may be wrong there.
>>
> 
> Thanks for the reply.
> 
> For 2): I've often searched for PHP parsing documentation.  I love the
> php.net documentation.  However, I have yet to find an excellent source
> documenting the PHP parser/engine.  My searches always lead to the Zend
> website, but it doesn't seem like I can get very far from that page.
> Any suggestions on where I could learn more of the nitty-gritty details
> of the PHP/Zend behaviours?
> 
> Thanks,
> dK

Daniel,

I'm only replying because I've been down this route of taking everything
into consideration countless times now.

Code optimisation, SQL and schema optimisation, and database and web
server tuning all have a far bigger impact than the things you're
considering, and as such should probably be given more weight.

Further, the next obvious steps are to get Zend Optimizer (which
optimizes the opcodes), then a good opcode cache; and finally cache all
the output you can so that PHP doesn't even have to come into the
equation for most "hits".
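For that last output-caching step, a minimal sketch of the idea (the cache path, key, and TTL here are just made-up examples, not anything standard):

```php
<?php
// Minimal full-page output cache sketch; paths and TTL are just examples.
$key = isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : 'cli';
$cacheFile = '/tmp/cache_' . md5($key) . '.html';
$ttl = 300; // seconds before a cached copy goes stale

if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
    readfile($cacheFile);              // cache hit: PHP does almost no work
} else {
    ob_start();                        // buffer everything the page prints
    echo "expensive page output\n";    // ...normal page generation here...
    $html = ob_get_clean();
    file_put_contents($cacheFile, $html); // store it for the next hit
    echo $html;
}
```

In practice you'd put the hit check in a front controller or auto-prepend file so cached pages short-circuit before any heavy code is even included.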

Then, finally, you get down to the bits you're considering, for those
extra microseconds and the knowledge that you've done all you can;
whether it'll make any difference at that point is another issue :p
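If you do want to chase those microseconds, the easiest answer to question 2 is to measure it yourself; a rough micro-benchmark sketch (the stub files and /tmp paths are invented just for the example):

```php
<?php
// Micro-benchmark sketch: 100 includes of the same file vs. 100
// different files.  The stub files below are created only for the test.
file_put_contents('/tmp/same.php', "<?php\n");
for ($i = 0; $i < 100; $i++) {
    file_put_contents("/tmp/part$i.php", "<?php\n");
}

$start = microtime(true);
for ($i = 0; $i < 100; $i++) {
    include '/tmp/same.php';           // same file every iteration
}
$sameTime = microtime(true) - $start;

$start = microtime(true);
for ($i = 0; $i < 100; $i++) {
    include "/tmp/part$i.php";         // a different file each iteration
}
$diffTime = microtime(true) - $start;

printf("same file: %.6fs, different files: %.6fs\n", $sameTime, $diffTime);
```

Note it uses plain include, not include_once, so the same file really is processed all 100 times; run it with and without an opcode cache loaded and compare.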

A bit of light reading for you:
http://phplens.com/lens/php-book/optimizing-debugging-php.php

regards!

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php

