RE: file locking...

hi rob...

here's the issue in more detail..

i have multiple processes that are spawned and run simultaneously. each
process wants to get XX files from the same batch of files... assume i have
a batch of 50,000 files. my issue is how do i allow each of the processes to
get its own batch of unique files as fast as possible. (the 50K number is
arbitrary.. the project will shrink/expand over time.)

if i dump all 50K files in the same dir, i can have a lock file that would
allow each process to sequentially read/write the lock file, and then access
the dir to get the XX files that process needs. (each process just grabs the
next batch of files for processing. there's no searching based on text in
the file names. it's a kind of FIFO queueing system.) this approach could
work, but it's basically sequential, and could in theory run into race
conditions around the lockfile.
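
something like this is what i have in mind.. a rough, untested sketch using
php's flock() (the paths and batch size are made up for the example):

<?php
// serialize access to the shared dir with an exclusive lock
$batchSize = 100;                        // "XX" files per grab (made up)
$srcDir    = '/data/incoming';           // shared batch dir (made up)
$workDir   = '/data/work/' . getmypid(); // this process's private dir

$lock = fopen('/data/incoming.lock', 'c');
if ($lock === false) exit(1);

if (flock($lock, LOCK_EX)) {             // blocks until we own the lock
    @mkdir($workDir, 0777, true);
    $claimed = 0;
    foreach (scandir($srcDir) as $f) {
        if ($f == '.' || $f == '..') continue;
        // move the file out of the shared dir while we hold the lock
        if (rename("$srcDir/$f", "$workDir/$f") && ++$claimed >= $batchSize) {
            break;
        }
    }
    flock($lock, LOCK_UN);               // let the next process in
}
fclose($lock);

and that's the rub: every other process sits blocked on flock() while one
process scans the dir.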

i could also have the process that creates the files throw them into
multiple directories, splitting the 50K files into separate dirs and somehow
implementing logic that lets each client process fetch files from its own
unique/separate dir.. but this could get ugly. a producer-side sketch of
that idea follows.
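
the producer side might look something like this (untested; the shard count
and paths are invented, and each consumer would be pinned to one shard dir
so the processes never touch the same dir):

<?php
// producer: hash each filename into one of N shard dirs
$numShards = 8;                  // invented; e.g. one shard per consumer
$shardRoot = '/data/shards';

foreach (glob('/data/incoming/*') as $path) {
    $name  = basename($path);
    $shard = abs(crc32($name)) % $numShards;   // stable filename -> shard
    @mkdir("$shardRoot/$shard", 0777, true);
    rename($path, "$shardRoot/$shard/$name");
}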

so my issue is essentially: how can i get as close as possible to
simultaneous access by client/child processes to a kind of FIFO of files...
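
one answer i've seen for exactly this is to skip the lockfile entirely and
let rename() be the lock: on the same filesystem a rename is atomic, so if
two processes race for the same file, exactly one wins and the loser just
moves on to the next file. untested sketch (paths/sizes made up):

<?php
// lock-free claim: every process scans the shared dir and tries to
// rename files into its own work dir; no two can claim the same file
$batchSize = 100;                          // made up
$workDir   = '/data/work/' . getmypid();
@mkdir($workDir, 0777, true);

$mine = array();
foreach (glob('/data/incoming/*') as $path) {
    $dest = $workDir . '/' . basename($path);
    if (@rename($path, $dest)) {           // we won this file
        $mine[] = $dest;
        if (count($mine) >= $batchSize) break;
    }
    // rename failed: another process got there first; skip it
}
// $mine now holds this process's unique batch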

whatever logic i create for this will also be used for the next iteration of
the project, where i get rid of the files and use some sort of database as
the informational storage.
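
fwiw the same claim pattern should carry over to the database version.. e.g.
with mysql (hypothetical jobs table with id/name/owner columns, owner NULL
meaning unclaimed; the UPDATE ... LIMIT form is mysql-specific):

<?php
// atomically tag up to 100 unowned rows with our pid, then read them back
$pid = getmypid();
$pdo = new PDO('mysql:host=localhost;dbname=queue', 'user', 'pass');

// the UPDATE is a single atomic statement, so no two processes
// can claim the same row
$pdo->exec("UPDATE jobs SET owner = $pid WHERE owner IS NULL LIMIT 100");

// fetch the rows we just claimed
$stmt = $pdo->query("SELECT id, name FROM jobs WHERE owner = $pid");
$jobs = $stmt->fetchAll(PDO::FETCH_ASSOC);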

hopefully this provides a little more clarity.

thanks


-----Original Message-----
From: Robert Cummings [mailto:robert@xxxxxxxxxxxxx]
Sent: Sunday, March 01, 2009 2:50 AM
To: bruce
Cc: php-general@xxxxxxxxxxxxx
Subject: Re:  file locking...


On Sat, 2009-02-28 at 21:46 -0800, bruce wrote:
> Hi.
>
> Got a bit of a question/issue that I'm trying to resolve. I'm asking this of
> a few groups so bear with me.
>
> I'm considering a situation where I have multiple processes running, and
> each process is going to access a number of files in a dir. Each process
> accesses a unique group of files, and then writes the group of files to
> another dir. I can easily handle this by using a form of locking, where I
> have the processes lock/read a file and only access the group of files in
> the dir based on the open/free status of the lockfile.
>
> However, the issue with the approach is that it's somewhat synchronous. I'm
> looking for something that might be more asynchronous/parallel, in that I'd
> like to have multiple processes each access a unique group of files from the
> given dir as fast as possible.
>
> So.. Any thoughts/pointers/comments would be greatly appreciated. Any
> pointers to academic research, etc.. would be useful.

Threads? Or spawn off child processes. Maybe I'm not understanding your
issues well enough.

Cheers,
Rob.
--
http://www.interjinn.com
Application and Templating Framework for PHP


-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php

