Downloading Large (100M+) Files

Hi.

I have searched a few of the mailing lists, and have not found an answer.

I am working on a site that currently runs gforge ( http://gforge.org/ ). The code it uses to download files from the file repository looks something like this:

	# strip any quote characters from the filename before handing it to the browser
	header('Content-disposition: filename="' . str_replace('"', '', $filename) . '"');
	header("Content-type: application/binary");
	# advertise the file size so clients know how much to expect
	$length = filesize($sys_upload_dir . $group_name . '/' . $filename);
	header("Content-length: $length");

	# read the entire file and push it out through the output buffer
	readfile($sys_upload_dir . $group_name . '/' . $filename);

The issue is that readfile() pushes the whole file through the output buffer before it reaches the client. When several people download large files at the same time (the Ant Download Manager, for instance, opens 20 connections for a single download), 20 buffered copies of a single 250 MB file rip through physical memory and swap pretty quickly and crash my machine.

Any thoughts on how to turn output buffering off? I have tried, but have not been able to get it working properly.
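
For context, the kind of replacement I have in mind is reading the file in small chunks and flushing each chunk straight to the client instead of calling readfile(). This is just a sketch, not working code -- the 8 KB chunk size and the ob_end_flush() loop are my guesses at what is needed:

=========
# stream the file in chunks instead of buffering the whole thing via readfile()
$path = $sys_upload_dir . $group_name . '/' . $filename;

# close any active output buffers so each chunk goes straight to the client
while (ob_get_level() > 0) {
    ob_end_flush();
}

$fp = fopen($path, 'rb');
if ($fp) {
    while (!feof($fp)) {
        # read and send 8 KB at a time, flushing after each chunk
        print fread($fp, 8192);
        flush();
        # stop early if the client disconnects mid-download
        if (connection_aborted()) {
            break;
        }
    }
    fclose($fp);
}
=========

Even with that, I am not sure whether flush() is enough on its own or whether the web server buffers the output again further downstream.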



On a similar note, is there a portable way to determine available system memory (physical and swap)? Right now I am using something like:
=========
# ensure there is enough free memory for the download
$free = shell_exec('free -b');

# collapse runs of spaces down to single spaces
$i = 0;
while ($i != strlen($free)) {
    $i = strlen($free);
    $free = str_replace('  ', ' ', $free);
}

$free = str_replace("\n", '', $free);
$freeArray = explode(' ', $free);

# fields 9 and 18 correspond to free physical memory and free swap in this output
$total_free = $freeArray[9] + $freeArray[18];
=========


Calling shell_exec('free') isn't very portable to systems that don't have the free command, though.
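
A Linux-only alternative would be to skip the shell and parse /proc/meminfo directly -- again just a sketch, and the MemFree/SwapFree parsing is my own guess at the format:

=========
# Linux-only: pull free physical memory and free swap out of /proc/meminfo
$total_free = 0;
$lines = @file('/proc/meminfo');
if ($lines) {
    foreach ($lines as $line) {
        # lines look like "MemFree:   123456 kB" -- values are in kilobytes
        if (preg_match('/^(MemFree|SwapFree):\s+(\d+)\s+kB/', $line, $m)) {
            $total_free += $m[2] * 1024;  # convert to bytes
        }
    }
}
=========

That still ties me to Linux, though, which is why I am hoping there is something more portable.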

Thanks in advance.
-Robin



