I wrote a PHP extension a while ago that implements "executor" persistence for scalar variables and constants. I never looked much into persisting objects, arrays, or resources, but that would be a useful addition to the extension if someone wants to contribute. I haven't updated the Web site with the latest version that compiles under 4.3.x, but if you're interested I can send you the files that have changed in the meantime. It has been immensely useful for my projects.

http://pwee.sourceforge.net

Lance

-----Original Message-----
From: Josh Whiting [mailto:jw-php@xxxxxxxxxxxxxx]
Sent: Monday, January 03, 2005 11:28 AM
To: php-general@xxxxxxxxxxxxx
Subject: Persistent PHP web application?

Dear list,

My web application (an online classifieds server) requires a set of fairly large global arrays containing vital information that almost all the page scripts rely on, such as the category list and the fields belonging to each category. Additionally, there are a large number of function definitions (more than 13,000 lines of code in all just for these global definitions).

These global arrays and functions never change between requests, yet the PHP engine destroys and recreates them on every request. After some serious benchmarking (using ApacheBench), I have found that this code takes at least 7ms to parse per request on my dual Xeon 2.4GHz server (Zend Accelerator in use*). This seriously cuts into my server's peak capacity, reducing it by more than half.

My question is: is there a way to define a global set of variables and functions ONCE per Apache process, allowing each incoming hit to run a handler function within a persistent namespace? OR, is it possible to create some form of shared variable and function namespace that each script can tap? AFAIK, mod_python, mod_perl, Java, etc. all allow you to create a persistent, long-running application with hooks/handlers for individual Apache requests.
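One partial workaround in the direction Josh asks about is to build the heavy arrays once, serialize them to a cache file, and have every subsequent request do a single unserialize() instead of re-executing thousands of lines of definition code. A minimal sketch, with a made-up file path and tiny stand-in data (unserialize() still costs time per request, so this only wins if it beats re-running the definitions):

```php
<?php
// Hypothetical sketch: cache the big global arrays as one serialized blob
// so each request rebuilds them with a single unserialize() call rather
// than re-executing the multi-thousand-line definition code.
// The path and the data below are stand-ins, not the real application's.

$cacheFile = '/tmp/classifieds_globals.ser';

if (!file_exists($cacheFile)) {
    // Expensive build step: in the real application this would be the
    // 13,000 lines of category/field definitions.
    $appGlobals = array(
        'categories' => array('autos', 'housing', 'jobs'),
        'fields'     => array('autos' => array('make', 'model', 'year')),
    );
    file_put_contents($cacheFile, serialize($appGlobals));
}

$appGlobals = unserialize(file_get_contents($cacheFile));
```

This does not help with the function definitions, which still have to be declared on every request; it only collapses the array-construction cost into one deserialization step.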
I'm surprised I haven't found a similar solution for PHP. In fact, according to my work over the past few days, if an application has a large set of global function and variable definitions, mod_python FAR exceeds the performance of mod_php, even though Python code runs significantly slower than PHP code, because in mod_python you can put all these definitions in a module that is loaded only once per Apache process.

The most promising prospect I've come across is FastCGI, which for Perl and other languages allows you to run a while loop that sits and receives incoming requests (e.g. "while (FCGI::accept() >= 0) { ... }"). However, the PHP/FastCGI model seems to perform about the same as mod_php: every request still creates and destroys the entire application (although the PHP interpreter itself does persist).

Essentially, I want to go beyond a persistent PHP *interpreter* (mod_php, PHP/FastCGI) and create a persistent PHP *application*... any suggestions?

Thanks in advance for any help!

Regards,
J. Whiting

* Please note that I am using the Zend Accelerator (on Red Hat Enterprise with Apache 1.3) to cache the intermediate compiled PHP code. My benchmarks (7ms+) are after the dramatic speedup provided by the accelerator. I wouldn't even bother benchmarking this without the compiler cache, but it is clear that a compiler cache does not prevent PHP from having to run the (albeit precompiled) array and function definition code itself.

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
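To make the footnote's point concrete: an opcode cache removes the parse/compile step, but the cached opcodes for a definition file like the stand-in below are still *executed* on every request, reconstructing each array and redeclaring each function in memory (tiny hypothetical data and helper, not the real application's):

```php
<?php
// Stand-in for the kind of definition code described above. With an
// opcode cache this file is not re-parsed, but these statements still
// run on every request, rebuilding the arrays from scratch.

$categories = array(
    'autos'   => array('fields' => array('make', 'model', 'year')),
    'housing' => array('fields' => array('rent', 'bedrooms')),
);

// Hypothetical helper; function declarations are likewise re-processed
// by the executor on each request under mod_php.
function category_fields(array $categories, $name)
{
    return $categories[$name]['fields'];
}
```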