RE: handling large files w/readfile

Jason Wong wrote:

Are you using the above code on its own (ie not within some other code that may affect the memory usage)?

Well, here's the entire file (it's pretty short - only about two pages - but apologies in advance if anyone considers this bad form).


The site is called with something like:

http://blackfin.uclinux.org/frs/download.php/123/STAMP.jpg

Files are stored in:

$sys_upload_dir.$group_name.'/'.$filename

-- frs/download.php ---------------------------------------------
<?php
/**
* GForge FRS Facility
*
* Copyright 1999-2001 (c) VA Linux Systems
* The rest Copyright 2002-2004 (c) GForge Team
* http://gforge.org/
*
* @version $Id: download.php,v 1.6 2004/10/08 23:05:29 gsmet Exp $
*
* This file is part of GForge.
*
* GForge is free software; you can redistribute it and/or modify
* it under the terms of the GNU General Public License as published by
* the Free Software Foundation; either version 2 of the License, or
* (at your option) any later version.
*
* GForge is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with GForge; if not, write to the Free Software
* Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA */


$no_gz_buffer=true;

require_once('pre.php');

$arr=explode('/',$REQUEST_URI);
$file_id=(int)$arr[3]; // cast to int so the id is safe to embed in the SQL below

$res=db_query("SELECT frs_file.filename,frs_package.is_public,
        frs_file.file_id,frs_file.release_id,groups.unix_group_name,groups.group_id
        FROM frs_package,frs_release,frs_file,groups
        WHERE frs_release.release_id=frs_file.release_id
        AND groups.group_id=frs_package.group_id
        AND frs_release.package_id=frs_package.package_id
        AND frs_file.file_id='$file_id'");

if (db_numrows($res) < 1) {
        Header("Status: 404");
        exit;
}

$is_public  = db_result($res,0,'is_public');
$group_name = db_result($res,0,'unix_group_name');
$filename   = db_result($res,0,'filename');
$release_id = db_result($res,0,'release_id');
$group_id   = db_result($res,0,'group_id');


$Group =& group_get_object($group_id);
if (!$Group || !is_object($Group) || $Group->isError()) {
        exit_no_group();
}

if (!$Group->isPublic()) {
        session_require(array('group' => $group_id));
}

// Members of projects can see all packages
// Non-members can only see public packages
if (!$is_public) {
        if (!session_loggedin() || (!user_ismember($group_id) && !user_ismember(1,'A'))) {
                exit_permission_denied();
        }
}


/*
echo $group_name.'|'.$filename.'|'.$sys_upload_dir.$group_name.'/'.$filename;
if (file_exists($sys_upload_dir.$group_name.'/'.$filename)) {
echo '<br />file exists';
passthru($sys_upload_dir.$group_name.'/'.$filename);
}
*/
if (file_exists($sys_upload_dir.$group_name.'/'.$filename)) {
Header('Content-disposition: filename="'.str_replace('"', '', $filename).'"');
Header("Content-type: application/binary");
$length = filesize($sys_upload_dir.$group_name.'/'.$filename);
Header("Content-length: $length");


        # Here is where all the problems start
        readfile($sys_upload_dir.$group_name.'/'.$filename);

        if (session_loggedin()) {
                $s =& session_get_user();
                $us = $s->getID();
        } else {
                $us = 100;
        }

        $res=db_query("INSERT INTO frs_dlstats_file (ip_address,file_id,month,day,user_id)
                VALUES ('$REMOTE_ADDR','$file_id','".date('Ym')."','".date('d')."','$us')");
} else {
Header("Status: 404");
}


?>
=============================================

If this runs for a while, things go very bad. The problem seems to be triggered by a specific download manager called NetAnts, which seems to be popular in China:

http://www.netants.com/

It attempts to open the same URL for downloading 10-15 times at the same instant.

If I replace things with:

==== snip =====
if (file_exists($sys_upload_dir.$group_name.'/'.$filename)) {
# if the file is too big to download (10Meg) - use a different method than php
$length = filesize($sys_upload_dir.$group_name.'/'.$filename);
Header('Content-disposition: filename="'.str_replace('"', '', $filename).'"');
Header("Content-type: application/binary");
Header("Content-length: $length");


        $fp = fopen($sys_upload_dir.$group_name.'/'.$filename,'rb');
        while (!feof($fp)) {
                $buff = fread($fp, 4096);
                print $buff;
        }
        unset($buff);
        fclose($fp);

===  snip - rest is the same =====

I get exactly the same problem - I come back and there are two to four Apache processes, each consuming memory on the order of the largest downloads.
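One likely culprit is PHP output buffering: if pre.php (or zlib.output_compression) leaves an output buffer active, every chunk the loop prints is accumulated in memory until the script ends, so chunked fread() buys nothing over readfile(). A minimal sketch of a loop that drains the buffers first and flushes each chunk - assuming $sys_upload_dir, $group_name and $filename are set as in the script above:

<?php
// Sketch only: stream the file in chunks while flushing PHP's and
// Apache's buffers, so memory stays at one chunk, not the whole file.
$path = $sys_upload_dir.$group_name.'/'.$filename;

// Close every active output buffer first; otherwise printed chunks
// pile up in memory until the request finishes.
while (ob_get_level() > 0) {
        ob_end_flush();
}

$fp = fopen($path, 'rb');
while (!feof($fp)) {
        print fread($fp, 8192);
        flush();        // push this chunk out to Apache/the client now
}
fclose($fp);
?>

Whether this helps will depend on whether the buffering really comes from PHP rather than from Apache holding the response for a stalled client.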

The only way I can make things work with large downloads is to use this:

==== snip ====
if (file_exists($sys_upload_dir.$group_name.'/'.$filename)) {
# if the file is too big to download (10Meg) - use a different method than php
$length = filesize($sys_upload_dir.$group_name.'/'.$filename);
if ($length >= 10485760) {
        $out = "http://downloads.".$sys_default_domain."/".$group_name."/".$filename;
        Header("Location: ".$out);
} else {
        # less than 10Meg - download with php
        Header('Content-disposition: filename="'.str_replace('"', '', $filename).'"');
        Header("Content-type: application/binary");
        Header("Content-length: $length");
        $fp = fopen($sys_upload_dir.$group_name.'/'.$filename,'rb');
        while (!feof($fp)) {
                $buff = fread($fp, 4096);
                print $buff;
        }
        unset($buff);
        fclose($fp);
}


==== snip - rest is the same =====
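For reference, the downloads.* redirect only works because a plain virtual host serves the upload directory directly, letting Apache do the file I/O instead of PHP. A hypothetical config along those lines - /var/lib/gforge/uploads and downloads.example.org are made-up names; substitute your $sys_upload_dir and $sys_default_domain:

# Hypothetical Apache virtual host serving uploads directly, bypassing PHP.
<VirtualHost *:80>
    ServerName downloads.example.org
    DocumentRoot /var/lib/gforge/uploads
    <Directory /var/lib/gforge/uploads>
        Options None
        AllowOverride None
    </Directory>
</VirtualHost>

Note that this bypasses the is_public/permission checks in download.php, so it is only safe for files that are public anyway.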

Someone suggested apache_child_terminate(), but that function doesn't seem to be available to me.
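For what it's worth, apache_child_terminate() is only registered when PHP runs as an Apache 1.x module with child_terminate enabled in php.ini, which would explain it being unavailable. A guarded call costs nothing either way:

<?php
// apache_child_terminate() only exists under the Apache 1.x SAPI with
// child_terminate = On in php.ini; guard the call so it degrades gracefully.
if (function_exists('apache_child_terminate')) {
        @apache_child_terminate();  // retire this child after the request
}
?>

Retiring the child after each big download would at least return the bloated memory to the OS instead of leaving it pinned in a long-lived Apache process.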

Thanks in advance.

-Robin

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php

