On 5/9/06, D. Dante Lorenso <dante@xxxxxxxxxxxxxx> wrote:
All,

I have a file which I want to stream from PHP:

    readfile($file_name);

However, this function has the problem that it reads the whole file into memory and then tries to write it to output. Sometimes you can hit the memory limit in PHP before the file contents are completely output, resulting in output truncated at 2 MB or 8 MB. I have seen code posted in the readfile user comments which hints at trying a userspace 'readfile_chunked' function to break the file into smaller parts, but this is still not entirely what I want:

http://us2.php.net/manual/en/function.readfile.php

My specific case is this: I have a queue of very large files that I'd like to output in series. I want to do something equivalent to this:

    readfile($big_file1); // 20 MB
    readfile($big_file2); // 20 MB
    readfile($big_file3); // 20 MB
    readfile($big_file4); // 20 MB

but I really don't want to start writing $big_file2 until $big_file1 has been written, and I don't want PHP to consume 20 MB at a time either. Ideally I'd like an output buffer of NN bytes, and only let PHP refill it once the client has read enough of the bytes to warrant another refill. That way, I might consume only about a 1 MB sliding window of output.

To do this, I would think I need a function in PHP which will output a buffered stream with blocking enabled. Can anybody point me in the right direction?

Dante

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
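[The 'readfile_chunked' approach mentioned above can be sketched roughly as follows. This is a minimal illustration, not the exact code from the manual comments; the chunk size, function name, and byte-count return value are arbitrary choices:]

```php
<?php
// Sketch of a userspace chunked readfile: stream a file to output
// in fixed-size pieces so PHP never holds the whole file in memory.
// Returns the number of bytes sent, or false on failure to open.
function readfile_chunked($filename, $chunk_size = 8192)
{
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    $bytes_sent = 0;
    while (!feof($handle)) {
        $buffer = fread($handle, $chunk_size);
        if ($buffer === false) {
            break;
        }
        echo $buffer;
        $bytes_sent += strlen($buffer);
        flush(); // push this chunk toward the client before reading the next
    }
    fclose($handle);
    return $bytes_sent;
}
```

[For the queue case, calling readfile_chunked() on each file in turn already gives the serial behavior asked for: the function only returns after the whole file has been echoed, so $big_file2 does not start until $big_file1 is done, and memory use stays near one chunk at a time.]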
I'm probably way off base on what you're trying to do, but maybe this will help: http://us2.php.net/manual/en/function.fseek.php
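[One way fseek() could help here is resuming a stream from a byte offset, e.g. after a dropped connection. A minimal sketch, assuming a hypothetical $offset supplied by the client; the function name and chunk size are illustrative:]

```php
<?php
// Sketch: skip to a byte offset with fseek(), then stream the
// remainder of the file in small chunks. $offset is assumed to
// come from the client (e.g. an HTTP Range header).
function stream_from_offset($filename, $offset, $chunk_size = 8192)
{
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    fseek($handle, $offset); // jump past the bytes already delivered
    $sent = 0;
    while (!feof($handle)) {
        $buffer = fread($handle, $chunk_size);
        if ($buffer === false) {
            break;
        }
        echo $buffer;
        $sent += strlen($buffer);
        flush();
    }
    fclose($handle);
    return $sent;
}
```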