On Saturday 01 August 2009 11:01:11 pm Eddie Drapkin wrote:
> > I actually benchmarked that once.  I had a reasonably large PHP file that
> > was, in fact, over 50% docblocks.  That's not even counting inline
> > comments.  While trying to find things to optimize, removing about 800
> > lines worth of comments (all of the docblocks) did, in fact, produce a
> > noticeable performance difference.  It was only barely noticeable, but it
> > just barely registered as more than random sampling jitter.  I actually
> > concluded that if cutting the file *in half* was only just barely
> > noticeable, then it really wasn't worth the effort.
>
> Yeah but what happens if you run the script through the tokenizer and
> strip ALL comments, unnecessary whitespace, newline characters, etc.
> out?

Honestly?  I think you'll save more CPU time by eliminating one SQL query.  Most files are not 60% comments.  In a file that is only about 20% comments, I doubt you could even measure the difference.  There are far, far more useful ways to optimize your code.

(Note that this is different for CSS or Javascript, where compressors like that are commonplace because you have to transfer the entire file over the network repeatedly, which is a few orders of magnitude slower than system memory.  Compressors and aggregators make sense there.  PHP code never leaves the server, so those benefits don't exist.)

-- 
Larry Garfield
larry@xxxxxxxxxxxxxxxx

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
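[For readers curious what the tokenizer-based stripping in the quoted question looks like in practice: PHP actually ships it as the built-in `php_strip_whitespace()`, but a hand-rolled sketch using `token_get_all()` would look roughly like the following.  The function name `strip_comments` is made up for illustration; the token constants are PHP's real ones.]

```php
<?php
// Rough sketch of comment/whitespace stripping via the tokenizer.
// In practice, php_strip_whitespace($filename) does this for you.
function strip_comments(string $source): string
{
    $out = '';
    foreach (token_get_all($source) as $token) {
        if (is_array($token)) {
            [$id, $text] = $token;
            if ($id === T_COMMENT || $id === T_DOC_COMMENT) {
                continue;          // drop comments and docblocks entirely
            }
            if ($id === T_WHITESPACE) {
                $out .= ' ';       // collapse runs of whitespace to one space
                continue;
            }
            $out .= $text;         // keep all other tokens verbatim
        } else {
            $out .= $token;        // single-character tokens: ; { } ( ) etc.
        }
    }
    return $out;
}
```

As the thread notes, with an opcode cache the parse happens once anyway, so this buys essentially nothing on the server side.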