D. Dante Lorenso wrote:
All,
I have a file which I want to stream from PHP:
readfile($file_name);
However, this function has a problem: the whole file ends up in memory
before it is written to output. Sometimes you can hit PHP's memory limit
before the file contents are completely sent, truncating the output at
2 MB or 8 MB. I have seen code posted in the readfile() user comments
that suggests a userspace 'readfile_chunked' function to send the file
in smaller pieces (a rough sketch follows below), but this is still not
entirely what I want:
http://us2.php.net/manual/en/function.readfile.php
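For reference, the gist of what those comments suggest is something like
this (my paraphrase, so treat the names and details as approximate):

function readfile_chunked($filename, $chunk_size = 1048576) {
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    $bytes_sent = 0;
    while (!feof($handle)) {
        $buffer = fread($handle, $chunk_size);
        echo $buffer;
        $bytes_sent += strlen($buffer);
        flush(); // push each chunk out before reading the next
    }
    fclose($handle);
    return $bytes_sent;
}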
My specific case is this. I have a queue of very large files that I'd
like to output in series. I want to do something equivalent to this:
readfile($big_file1); // 20 MB
readfile($big_file2); // 20 MB
readfile($big_file3); // 20 MB
readfile($big_file4); // 20 MB
but I really don't want to start writing $big_file2 until $big_file1 has
been completely written, and I don't want PHP to consume 20 MB at a time
either. Ideally, I'd like an output buffer of NN bytes that PHP only
refills once the client has read enough of it to warrant another refill.
That way, I might only consume about a 1 MB sliding window of output.
To do this, I would think I need a function in PHP that writes to a
buffered output stream with blocking enabled. Can anybody point me in
the right direction?
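The closest built-in I have found is the chunk-size argument to
ob_start(), which auto-flushes the buffer whenever output pushes it past
that many bytes. As far as I can tell it only flushes, though; it
doesn't block on the client, so it may be only half of what I'm after:

// auto-flush the output buffer each time it reaches ~1 MB
ob_start(null, 1048576);
readfile($big_file1); // readfile()'s internal writes pass through the buffer
readfile($big_file2);
ob_end_flush();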
readfile() pushes the whole file out in a single call, and with output
buffering turned on the entire contents can end up sitting in memory -
if you don't want that, you can't use readfile() on its own.
You don't need anything complicated - or am I misunderstanding the
question, which is more likely...
$size = 1048576; // 1 MB chunk
$fp = fopen($big_file1, 'rb');
while (!feof($fp)) {
    // fread() is better than fgets() for binary data - it doesn't stop at newlines
    echo fread($fp, $size);
    flush(); // hand each chunk to the web server instead of letting it pile up
}
fclose($fp);
http://www.php.net/manual/en/function.fread.php
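For your queue of files the same loop just goes inside a foreach. The
one other thing I'd check (untested, adjust to taste) is that no output
buffer is sitting in front of the echo, since that is usually what makes
large downloads blow the memory limit in the first place:

// turn off any userland output buffers before streaming
while (ob_get_level() > 0) {
    ob_end_flush();
}
set_time_limit(0); // 80 MB to a slow client can exceed the default time limit

foreach (array($big_file1, $big_file2, $big_file3, $big_file4) as $file) {
    $fp = fopen($file, 'rb');
    while (!feof($fp)) {
        echo fread($fp, $size);
        flush(); // a slow client throttles this loop via the socket, not memory
    }
    fclose($fp);
}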
--
Postgresql & php tutorials
http://www.designmagick.com/
--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php