Paul M Foster wrote:
On Fri, Apr 10, 2009 at 09:01:14AM -0400, Bob McConnell wrote:
From: Paul M Foster
Here's a harebrained idea I was kicking around. I object to the idea of
including 15 or 30 files in a PHP application just to display one page
on the internet. It makes the coding faster, but it makes the display
slower and seems silly to me.
So what if you had a tool or a set of tools where you could write code
snippets and such, and then hit a button or issue a command, and
everything you specified got written into a single file? You'd specify
that this page needs to read the config, set up a database connection,
validate these fields, etc. When you were done, it would write all this
code to a *single* file, which the user would invoke by surfing to that
page. The resulting code would be *static*, not like what results from
most templating systems. So rather than specify a specific variable
value in the resulting file, it would embed the PHP code to display the
variable, etc.
What might be the liabilities of something like that? Would there be
security issues? Would there be increased difficulty in debugging? What
can you think of?
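
For illustration, the output of such a tool might look roughly like
this (all file and variable names are hypothetical); the point is that
the variable is still emitted by embedded PHP at request time, not
substituted when the file was generated:

    <?php
    // page.php: hypothetical output of such a tool. The config, the
    // database connection and the field validation are all inlined,
    // but the PHP itself is left intact, so the page stays dynamic.
    $config = array('dsn' => 'mysql:host=localhost;dbname=app');
    $db = new PDO($config['dsn'], 'appuser', 'secret');
    $name = isset($_GET['name']) ? trim($_GET['name']) : 'world';
    ?>
    <html><body>
    <p>Hello, <?php echo htmlspecialchars($name); ?>!</p>
    </body></html>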
Programs to do that used to be called compilers. There is an entire
branch of computer science and a lot of tools (lex, yacc, etc.)
dedicated to that topic.
I know compilers. I've coded in C for years. I'm not talking about a
compiler here. It's more an "aggregator". The resulting page would still
be PHP/HTML, but everything needed in it would be self-contained (except
the web server and PHP interpreter). Kind of like what "make" does,
except that make typically invokes the compiler to mash it all into one
executable at the end.
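
A minimal sketch of such an aggregator, assuming hypothetical snippet
file names; it only concatenates the sources into one self-contained
page and never compiles or evaluates the PHP in them:

    <?php
    // build.php: a sketch of the "aggregator". It concatenates the
    // listed snippet files into one page file; the PHP in each
    // snippet is copied verbatim, nothing is executed here.
    $snippets = array(
        'snippets/config.php',
        'snippets/db_connect.php',
        'snippets/validate_fields.php',
        'snippets/page_body.php',
    );
    $out = '';
    foreach ($snippets as $file) {
        $out .= file_get_contents($file) . "\n";
    }
    file_put_contents('public_html/page.php', $out);
    echo "Wrote page.php (" . strlen($out) . " bytes)\n";
    ?>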
It's not a bad idea, but there is one precarious assumption that
underlies it. Can you absolutely guarantee there will never be a second,
or third, or more pages on that server that will need some of those
functions or classes? As soon as the site begins to evolve and grow, you
will have multiple copies of many of those snippets, and when (not if)
you need to modify them, you will have to find and change every single
copy.
So you need to ask yourself if this strategy is maintainable in your
case. And will it make any real difference in the end?
Good point. That's why I asked the question in the first place. Every
time you revised a supporting file, you'd have to regenerate all the
files that depended on it. Might be okay for a small site, but could be
a nightmare for a large site.
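
Concretely, the regeneration step would need a dependency map and a
staleness check, much like a makefile; the page and snippet names below
are made up:

    <?php
    // rebuild.php: sketch of the regeneration problem. Each page
    // lists the snippets it was built from; a page is rebuilt when
    // any snippet is newer, the way make compares mtimes.
    $deps = array(
        'page1.php' => array('snippets/config.php', 'snippets/db.php'),
        'page2.php' => array('snippets/config.php', 'snippets/forms.php'),
    );
    foreach ($deps as $page => $snippets) {
        foreach ($snippets as $snippet) {
            if (!file_exists($page) || filemtime($snippet) > filemtime($page)) {
                echo "Regenerating $page (stale against $snippet)\n";
                // ... invoke the aggregator for $page here ...
                break;
            }
        }
    }
    ?>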
Paul
You could try substituting all your calls to include() or require()
with SSI includes and let your web server do the aggregation instead.
I read an article about retrieving the web server's output after it
performs the SSI actions but before handing it over to the application
server, but I can't remember where ...
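
A minimal illustration, assuming Apache with mod_include enabled and
hypothetical file names. A PHP-side include such as:

    <?php include 'header.php'; ?>

would become an SSI directive in the page:

    <!--#include virtual="/fragments/header.html" -->

so the fragment is stitched in by the web server rather than by PHP.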