On Sat, 2009-02-28 at 21:46 -0800, bruce wrote:
> Hi.
>
> Got a bit of a question/issue that I'm trying to resolve. I'm asking this
> of a few groups, so bear with me.
>
> I'm considering a situation where I have multiple processes running, and
> each process is going to access a number of files in a dir. Each process
> accesses a unique group of files, and then writes the group of files to
> another dir. I can easily handle this by using a form of locking, where I
> have the processes lock/read a file and only access the group of files in
> the dir based on the open/free status of the lockfile.
>
> However, the issue with that approach is that it's somewhat synchronous.
> I'm looking for something that might be more asynchronous/parallel, in
> that I'd like to have multiple processes each access a unique group of
> files from the given dir as fast as possible.
>
> So... any thoughts/pointers/comments would be greatly appreciated. Any
> pointers to academic research, etc. would be useful.

Threads? Or spawn off child processes.

Maybe I'm not understanding your issues well enough.

Cheers,
Rob.
--
http://www.interjinn.com
Application and Templating Framework for PHP

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
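
[Follow-up] For the "spawn off child processes" route, something along these lines might do it. This is only a rough sketch: it assumes the pcntl extension (CLI PHP only), and the directory names, worker count, and the copy() call standing in for the real per-file work are all made up. The idea is that the parent partitions the files into unique groups up front, so each child works on its own group and no lockfile is needed to serialize access.

<?php
// Partition the source files round-robin into one group per worker,
// then fork a child for each group.
$srcDir   = '/path/to/source';   // hypothetical source dir
$dstDir   = '/path/to/dest';     // hypothetical destination dir
$nWorkers = 4;

$files  = glob($srcDir . '/*');
$groups = array();
foreach ($files as $i => $file) {
    $groups[$i % $nWorkers][] = $file;   // each file ends up in exactly one group
}

$pids = array();
foreach ($groups as $group) {
    $pid = pcntl_fork();
    if ($pid == -1) {
        die("fork failed\n");
    } elseif ($pid == 0) {
        // Child: handle only its own group of files, then exit.
        foreach ($group as $file) {
            copy($file, $dstDir . '/' . basename($file));  // stand-in for the real processing
        }
        exit(0);
    }
    $pids[] = $pid;   // parent records the child and keeps forking
}

// Parent waits for all children to finish.
foreach ($pids as $pid) {
    pcntl_waitpid($pid, $status);
}
?>

If the file groups can't be decided up front, each child could instead claim files with a non-blocking flock(..., LOCK_EX | LOCK_NB) and simply skip anything already locked, which avoids the synchronous wait the original lockfile scheme imposes.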