Re: large files and readfile

On Mon, Feb 06, 2006 at 09:21:22AM -0800, Daniel Bondurant wrote:
> I am using php and readfile() to control the download of large files;  
> These files can be up to 1GB.    There is nothing really exciting or  
> special about the script itself.
> 
> The problem I am running into is php is loading the entire file into  
> apache's memory as the file is being read - this seems quite  
> unnecessary.   I tried it with fpassthru() as well, with the same  
> result.
> 
> Why is php loading the entire file into memory?  Is there another/ 
> better way to download the files (other than resorting to  
> mod_rewrite) that won't use up necessary memory?
> 
> thanks
>  - daniel
> 

Try 'fread' instead. You can control how many bytes are read.
Something like:

// read the file in fixed-size chunks instead of all at once
$filename = "/usr/local/something.txt";
$handle = fopen($filename, "rb");
while (!feof($handle)) {
    // 8192 bytes per call; adjust the chunk size to taste
    $contents = fread($handle, 8192);
    // ... do something with $contents here ...
}
fclose($handle);

What are you doing with the file? Do you want to output it as you
read it? If so, you will need to send each chunk in $contents to the
browser and flush the output buffers as you go.

-- 
Jim Kaufman
Linux Evangelist
public key 0x6D802619
CCNA, CISSP# 65668

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php

