Re: throttle output streamed from a file?

Jochem Maas wrote:
D. Dante Lorenso wrote:
All,

I have a file which I want to stream from PHP:

it's not that relevant, but I don't think streaming is the correct term.
you're merely dumping a file's contents to standard output.


   readfile($file_name);

the trick you need to employ involves opening the file and reading/dumping
it in suitably sized chunks.


However, this function has the problem that it reads the whole file into memory and then tries to write it to output. Sometimes you can hit the PHP memory limit before the file contents are completely output, resulting in output truncated at 2 MB or 8 MB. I have seen code posted in the readfile user comments that hints at a userspace 'readfile_chunked' function to break the file into smaller parts, but this is still not entirely what I want:

   http://us2.php.net/manual/en/function.readfile.php
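
For reference, a minimal sketch of that kind of chunked readfile (the 8 KB chunk size and the function name are just illustrative, and this is untested) would look something like:

   // Sketch of a userspace chunked readfile: emit the file in small
   // pieces instead of loading the whole thing into memory at once.
   function readfile_chunked($file_name, $chunk_size = 8192)
   {
       $fp = fopen($file_name, 'rb');
       if ($fp === false) {
           return false;
       }
       $bytes_sent = 0;
       while (!feof($fp)) {
           $chunk = fread($fp, $chunk_size);
           if ($chunk === false) {
               break;
           }
           echo $chunk;
           $bytes_sent += strlen($chunk);
       }
       fclose($fp);
       return $bytes_sent;
   }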

My specific case is this. I have a queue of very large files that I'd like to output in series. I want to do something equivalent to this:


   readfile($big_file1); // 20 MB
   readfile($big_file2); // 20 MB
   readfile($big_file3); // 20 MB
   readfile($big_file4); // 20 MB
doing that would mean the client ends up with garbage - I think -
because the client just sees one big stream of data with no way of
knowing that it constitutes multiple files.
No need to worry about that problem. The data I am streaming will be properly consumed by the client ;-)
using something like exec() to call the system's tar & gzip commands on the
files in question and then dumping out the resulting 'tarball' in chunks
as described above would be a way to achieve this.
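
As an illustration of that idea (a variant, really: it pipes tar and gzip straight to the output via passthru() instead of building a temporary tarball first, and the archive name and headers are made up):

   // Sketch only: stream a gzipped tarball of the queued files directly
   // to the client. escapeshellarg() keeps the shell command safe.
   $files = array($big_file1, $big_file2, $big_file3, $big_file4);
   $args  = implode(' ', array_map('escapeshellarg', $files));

   header('Content-Type: application/x-gzip');
   header('Content-Disposition: attachment; filename="queue.tar.gz"');

   // tar writes the archive to stdout ("-f -"), gzip compresses it, and
   // passthru() copies the command's raw output straight to the client.
   passthru('tar -cf - ' . $args . ' | gzip -c');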

Chunking the output is not the problem either. I know how to fread(...) and fseek(...). I can 'chunk-split' the output into smaller pieces, but that's also not the problem. The problem is that if the printing of bytes does not block during output, then the server (Apache/PHP) will shove as much data as it can into the output buffer as fast as PHP will execute. If the client is only downloading at a slow rate, then we'll end up with a backlog of bytes written by PHP but not yet consumed by the client. Assuming we have a file as large as 10 gigabytes, that can't all fit into RAM, so where does the data go?

but I really don't want to start writing $big_file2 until $big_file1 has been written and I don't want PHP to consume 20 MB at a time either. Ideally I'd like to have an output buffer size of NN bytes and only let PHP fill that buffer until the client has read enough of the bytes to warrant another refill. That way, I might only consume about a 1 MB sliding window of output.

To do this, I would think I need a function in PHP which will output a buffered stream with blocking enabled. Can anybody point me in the right direction?
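
For what it's worth, the closest thing I know of (and whether the script really blocks for a slow client depends on the SAPI and webserver configuration, so treat that as an assumption) is to keep nothing buffered inside PHP and flush() after every chunk, so any backpressure from the client lands on the webserver rather than piling up in the script:

   // Sketch: drop PHP's userland output buffers, then stream the queue
   // of files in 64 KB chunks, handing each chunk to the webserver
   // immediately. The chunk size is arbitrary.
   while (ob_get_level() > 0) {
       ob_end_flush();
   }

   $queue = array($big_file1, $big_file2, $big_file3, $big_file4);
   foreach ($queue as $file_name) {
       $fp = fopen($file_name, 'rb');
       while (!feof($fp)) {
           echo fread($fp, 65536);   // never more than 64 KB held in PHP
           flush();                  // push it out to the webserver now
       }
       fclose($fp);
   }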

http://php.net/streams - only I doubt that this would give you what you
want given that the client is a web browser and the connection is not
directly under the control of PHP (it's the webserver's responsibility).

this is conjecture - I'm still wrapping my head around the streams concepts on a
regular basis!

I think I understand quite a bit of the streams stuff, but I'm trying to figure out how to control the output buffering and how to dump a large number of bytes (absolutely huge!) to a client who may be very slow.
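
One streams-flavoured way to get something like that NN-byte window (the sizes here are arbitrary, and whatever the webserver does below PHP is still out of the script's hands) is to combine a fixed-size, auto-flushing output buffer with stream_copy_to_stream():

   // Sketch: a single userland output buffer that PHP flushes down to
   // the SAPI every time it passes ~1 MB, plus a chunked copy of each
   // queued file to php://output. Buffer and step sizes are arbitrary.
   ob_start(null, 1024 * 1024);

   $out   = fopen('php://output', 'wb');
   $queue = array($big_file1, $big_file2, $big_file3, $big_file4);

   foreach ($queue as $file_name) {
       $in = fopen($file_name, 'rb');
       while (!feof($in)) {
           // copy at most 64 KB per iteration from the current position
           stream_copy_to_stream($in, $out, 65536);
       }
       fclose($in);
   }
   fclose($out);
   ob_end_flush();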

Dante

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php

