So here's the scenario: I have a site that uses PHP with a database to offer sound files to users via streaming methods. The request page has options allowing the user to modify the sound file in various ways before it is sent to them.

Here's the problem: the method I'm using to feed the data to the user is to run the source file through various piped commands, with the resulting audio dumped to stdout, and then use passthru() in PHP to get that data to the end user. Here's an example, for serving an MP3 with its pitch/speed changed by sox:

    passthru("lame --quiet --decode \"" . $in_file . "\" - | " .
             "sox -V -S -t wav - -t wav - speed " . $speed_factor . " | " .
             "lame --quiet " . $lame_params . " - -");

This works just fine, except that if the end user aborts the transfer (e.g. stops playback in the media player, cancels the download of the MP3, whatever), it leaves behind both the sox process and the decoding lame process, along with the sh that's running them. The only process that exits is the final encoding lame process. If the sound file runs to completion, everything exits properly. But this obviously means that enough "cancelling" of downloads leaves the server with a huge batch of stuck processes!

I even tried simply killing the 'host' sh process, but the lame and sox processes remain anyway. The only way I've been able to deal with this is by manually killing the lame and sox processes directly.

Is there any way I can make this work, so that if the user cancels the transfer, all relevant processes are killed rather than just the single process that's feeding output into PHP?

-FM
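P.S. To make the question more concrete, here's a rough sketch of the kind of thing I'm imagining, using proc_open() instead of passthru() so the script can watch for the disconnect itself. It assumes the posix extension is loaded and a setsid(1) binary exists on the server, and it's completely untested:

    <?php
    ignore_user_abort(true);   // keep PHP running after a disconnect
                               // so we can do the cleanup ourselves

    $cmd = "lame --quiet --decode " . escapeshellarg($in_file) . " - | "
         . "sox -V -S -t wav - -t wav - speed " . $speed_factor . " | "
         . "lame --quiet " . $lame_params . " - -";

    // setsid makes the whole pipeline the leader of a brand-new
    // process group, so a single signal can reach every process in it
    $proc = proc_open("setsid sh -c " . escapeshellarg($cmd),
                      array(1 => array('pipe', 'w')), $pipes);
    if (!is_resource($proc)) {
        exit;
    }

    $status = proc_get_status($proc);
    $pgid   = $status['pid'];  // the group leader's PID is also the PGID

    while (!feof($pipes[1])) {
        echo fread($pipes[1], 8192);
        flush();
        if (connection_aborted()) {
            // a negative PID signals the whole process group:
            // both lames, sox, and the sh should all get it
            posix_kill(-$pgid, 15);   // 15 = SIGTERM
            break;
        }
    }

    fclose($pipes[1]);
    proc_close($proc);

The point of the setsid wrapper is that signalling the negative PGID should take down sh, sox, and both lame processes together, which is exactly what killing the sh by itself doesn't do. No idea whether that's the cleanest way, though, hence the question.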