On Tue, May 9, 2006 11:11 pm, D. Dante Lorenso wrote:
> Will 'echo' block until the client has consumed the whole $size amount
> of data? If not, how fast will your while loop execute? If
> file_size($big_file1) exceeds 1 TB, does your server end up sucking up
> all available memory? Or does PHP crash after hitting a memory limit?

I can't be 100% certain, but I'm pretty sure the buffering between the
client, Apache, and PHP for a simple fopen/fread/echo loop will resolve
your problems and take care of blocking.

You're basically confusing "blocking", which all streams do by default
unless you turn it off on purpose, with "the RAM limitation of trying to
suck in a 2G file". They're not the same problem at all.

So, the short answer is: yes, almost for sure, echo will block "enough"
that you won't run out of RAM. You can control the chunk size in your
fread call to get whatever RAM/performance ratio you like. I think 2048
is recommended, as it matches internal OS buffer sizes.

--
Like Music?
http://l-i-e.com/artists.htm

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
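For reference, here is a minimal sketch of the chunked fopen/fread/echo
loop described above. The function name, the $path/$chunkSize parameters,
and the 2048-byte default are illustrative choices, not from the original
post; the point is only that memory use stays bounded by the chunk size,
not the file size.

```php
<?php
// Stream a file to the client in fixed-size chunks. PHP never holds
// more than one chunk in memory, regardless of the file's total size.
function stream_file(string $path, int $chunkSize = 2048): int
{
    $fp = fopen($path, 'rb');
    if ($fp === false) {
        return 0;
    }
    $sent = 0;
    while (!feof($fp)) {
        $chunk = fread($fp, $chunkSize);
        if ($chunk === false || $chunk === '') {
            break;
        }
        echo $chunk;              // blocks while the output buffer drains
        $sent += strlen($chunk);  // track total bytes written
    }
    fclose($fp);
    return $sent;
}
```

In a real download script you would typically send Content-Type and
Content-Length headers first and may want to disable output buffering
(ob_end_flush / flush) so chunks reach the client promptly.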