$_SESSION is much the same, as it uses serialize/unserialize under the hood, although there are some differences. (A rough sketch of the serialize/unserialize caching approach follows below the quoted thread.)

On Sat, 15 Jan 2005 10:06:58 +0100, Zouari Fourat <fourat@xxxxxxxxx> wrote:
> what about using $_SESSION arrays ?
>
> On Fri, 14 Jan 2005 19:28:24 -0500, Al <news@xxxxxxxxxxxxx> wrote:
> > George Schlossnagle addresses exactly your requirement in his book
> > "Advanced PHP Programming".
> >
> > Josh Whiting wrote:
> > > Dear list,
> > >
> > > My web application (an online classifieds server) requires a set of
> > > fairly large global arrays which contain vital information that almost
> > > all the page scripts rely upon, such as the category list, which fields
> > > belong to each category, and so on. Additionally, there are a large
> > > number of function definitions (more than 13,000 lines of code in all
> > > just for these global definitions).
> > >
> > > These global arrays and functions never change between requests.
> > > However, the PHP engine destroys and recreates them every time. After
> > > having spent some serious time doing benchmarking (using Apache Bench),
> > > I have found that this code takes at least 7ms to parse per request on
> > > my dual Xeon 2.4GHz server (Zend Accelerator in use*). This seriously
> > > cuts into my server's peak capacity, reducing it by more than half.
> > >
> > > My question is: is there a way to define a global set of variables and
> > > functions ONCE per Apache process, allowing each incoming hit to run a
> > > handler function within a persistent namespace? OR, is it possible to
> > > create some form of shared variable and function namespace that each
> > > script can tap?
> > >
> > > AFAIK, mod_python, mod_perl, Java, etc. all allow you to create a
> > > persistent, long-running application with hooks/handlers for individual
> > > Apache requests. I'm surprised I haven't found a similar solution for
> > > PHP.
> > >
> > > In fact, according to my work in the past few days, if an application
> > > has a large set of global functions and variable definitions, mod_python
> > > FAR exceeds the performance of mod_php, even though Python code runs
> > > significantly slower than PHP code (because in mod_python you can put
> > > all these definitions in a module that is loaded only once per Apache
> > > process).
> > >
> > > The most promising prospect I've come across is FastCGI, which, for Perl
> > > and other languages, allows you to run a while loop that sits and
> > > receives incoming requests (e.g. "while(FCGI::accept() >= 0) {..}").
> > > However, the PHP/FastCGI modality basically compares to mod_php: every
> > > request still creates and destroys the entire application (although the
> > > PHP interpreter itself does persist).
> > >
> > > Essentially I want to go beyond a persistent PHP *interpreter* (mod_php,
> > > PHP/FastCGI) and create a persistent PHP *application*... any
> > > suggestions?
> > >
> > > Thanks in advance for any help!
> > > Regards,
> > > J. Whiting
> > >
> > > * - Please note that I am using the Zend Accelerator (on Red Hat
> > > Enterprise with Apache 1.3) to cache the intermediate compiled PHP code.
> > > My benchmarks (7ms+) are after the dramatic speedup provided by the
> > > accelerator. I wouldn't even bother benchmarking this without the
> > > compiler cache, but it is clear that a compiler cache does not prevent
> > > PHP from still having to run the (albeit precompiled) array and function
> > > definition code itself.
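To make the serialize/unserialize idea concrete, here is a minimal sketch of caching the big category arrays in a serialized file so they are built once and merely unserialized on later requests. It is only an illustration under assumptions: the cache path, buildCategoryData(), and the exact data layout are made up, not from Josh's application; only serialize(), unserialize(), file_get_contents(), file_put_contents() and the other filesystem calls are standard PHP.

    <?php
    // Hypothetical sketch: cache the large, never-changing global arrays
    // in a serialized file instead of re-executing the array-definition
    // code on every request. CATEGORY_CACHE and buildCategoryData() are
    // placeholder names for illustration only.

    define('CATEGORY_CACHE', '/tmp/categories.ser');

    function loadCategoryData()
    {
        // Fast path: reuse the cached copy if one exists. Unserializing a
        // prebuilt array may be cheaper than running thousands of lines of
        // definition code, but that should be benchmarked, not assumed.
        if (is_readable(CATEGORY_CACHE)) {
            $data = unserialize(file_get_contents(CATEGORY_CACHE));
            if ($data !== false) {
                return $data;
            }
        }

        // Cache miss (or corrupt cache): rebuild the arrays the slow way,
        // then store them for subsequent requests.
        $data = buildCategoryData();   // stands in for the existing
                                       // array-definition code
        file_put_contents(CATEGORY_CACHE, serialize($data));
        return $data;
    }

    // Usage in a page script:
    // $categories = loadCategoryData();
    ?>

Note this only covers the data half of the problem: function definitions cannot be serialized, although the Zend Accelerator is already caching their compiled form. A shared-memory store (e.g. the sysvshm functions shm_put_var()/shm_get_var(), or APC) could stand in for the file to avoid the disk read, at the cost of a little more setup.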