> > "Threading" is only realistically needed when you have to get data
> > from multiple sources; you may as well get it all in parallel rather
> > than sequentially to limit the amount of time your application /
> > script is sitting idle and not doing any processing.
>
> In the CLI you can leverage forking the process to cover this.
>
> When working in the HTTP layer / through a web server, you can
> leverage HTTP itself by giving each query its own URL and sending out
> every request in a single HTTP session, letting the web server do the
> heavy lifting and multi-threading; you then get all responses back in
> the order you requested them.

Regarding leveraging HTTP to achieve multi-threading-like capabilities, I've tried this with my own framework. (Each individual dynamic region of a page is automatically available as a RESTful call to the same page to facilitate AJAX capabilities, and I tried using curl to process each of the regions in parallel to see whether the pseudo-threading would be an advantage.) In my tests, the overhead of the additional HTTP requests killed any advantage that might have been gained by generating the dynamic regions in parallel.

Do you know of any examples where this actually improved performance? If so, I'd like to see them so I can experiment more with the ideas.

Thanks,

Adam

--
Nephtali: PHP web framework that functions beautifully
http://nephtaliproject.com
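P.S. For anyone who wants to experiment with the tradeoff discussed above: parallel fetching only pays off when per-request latency dominates the per-request overhead, which is exactly what my curl test ran into. Here is a minimal sketch (in Python rather than PHP, purely for illustration) that simulates slow data sources with `time.sleep` instead of real HTTP requests; the names `fetch_region` and the region list are made up for the example.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fetch_region(name, latency=0.1):
    # Simulated slow data source; stands in for one HTTP request
    # whose cost is dominated by waiting on I/O.
    time.sleep(latency)
    return f"<div>{name}</div>"

regions = ["header", "sidebar", "content", "footer"]

# Sequential: total time is roughly the sum of all latencies.
start = time.perf_counter()
seq = [fetch_region(r) for r in regions]
seq_time = time.perf_counter() - start

# Parallel: total time is roughly the single slowest request,
# and pool.map returns results in the order they were requested.
start = time.perf_counter()
with ThreadPoolExecutor() as pool:
    par = list(pool.map(fetch_region, regions))
par_time = time.perf_counter() - start

print(f"sequential: {seq_time:.2f}s, parallel: {par_time:.2f}s")
```

If you shrink the simulated latency toward zero, the fixed cost of dispatching each request swallows the gain, which mirrors what happened when each page region became its own HTTP round trip.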