Re: [? BUG ?] weird thing; downloading from a php script stops at exactly 2.000.000 bytes

It's probably something to do with a maximum-memory setting, or something
like that; but taking into account that your method stretches the
resources, fopen/fread may be a better solution.
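
Something along the lines of this rough, untested sketch, say — the
function name, the explicit flush() and the 8192-byte chunk size are my
guesses, in case an output buffer is where the 2.000.000-byte cap lives:

<?php
// Untested sketch: stream the file in fixed-size chunks and flush each
// one, so neither memory_limit nor an output buffer sees the whole file.
function stream_file($file)        // hypothetical name, not a built-in
{
    $fp = fopen($file, 'rb');
    if ($fp === false) {
        return false;
    }
    while (!feof($fp)) {
        echo fread($fp, 8192);     // 8192 bytes is an arbitrary choice
        flush();                   // push the chunk out straight away
    }
    fclose($fp);
    return true;
}
?>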

I'd be curious to see the benchmarked differences between the two
approaches, but I can't be bothered to run the numbers right now;
something like the sketch below would do it.
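
Untested, and the test-file path is made up — run it from the CLI with a
local multi-megabyte file so the network doesn't skew the timings:

<?php
// Rough timing sketch comparing readfile() with a fread() loop.
$file = '/tmp/test.csv';           // hypothetical test file of a few MB

ob_start();                        // swallow the output so we only time the reads

$start = microtime(true);
readfile($file);
$readfile_time = microtime(true) - $start;
ob_clean();                        // empty the buffer between the two runs

$start = microtime(true);
$fp = fopen($file, 'rb');
while (!feof($fp)) {
    echo fread($fp, 1024000);
}
fclose($fp);
$fread_time = microtime(true) - $start;
ob_end_clean();                    // discard the buffered output

printf("readfile: %.4fs, fread loop: %.4fs\n", $readfile_time, $fread_time);
?>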

On 6/10/05, Catalin Trifu <catalin@xxxxxxxxxxxxxxx> wrote:
>    Hi,
> 
>    I tried it and it does indeed work, but it's quite annoying to resort to such
> tricks, and it's not the best solution either; fopen and fread are "expensive".
>    I can't tell whether it's a bug in PHP or some config option.
> 
> 
> C.
> 
> 
> Rory Browne wrote:
> > I've never come across that problem, but try this:
> >
> > function output_file($filename){
> >     $fp = fopen($filename, "rb");   // binary-safe mode
> >     while(!feof($fp)){
> >         echo fread($fp, 1024000);
> >     }
> >     fclose($fp);                    // release the handle
> > }
> >
> > On 6/9/05, Catalin Trifu <catalin@xxxxxxxxxxxxxxx> wrote:
> >
> >>Hi,
> >>
> >>   I installed php5 using the configure line below. I tried with apache2 as well, with the same result.
> >>
> >>'./configure' '--prefix=/usr/local/php5' '--with-apxs=/usr/local/apache/bin/apxs' '--disable-cgi'
> >>'--with-config-file-path=/etc/php5' '--with-dom' '--with-gd' '--enable-sockets' '--enable-exif'
> >>'--with-freetype2' '--with-freetype-dir=/usr/include/freetype2' '--enable-gd-native-ttf'
> >>'--with-zlib-dir=/usr' '--with-curl' '--with-curlwrappers' '--enable-ftp' '--with-mysql=/usr'
> >>'--with-xsl' '--with-libxml-dir=/usr'
> >>
> >>   I have a script which generates a temporary catalog file; the file is generated
> >>correctly and is 4.7MB on disk.
> >>   Then I push it up the wire with readfile($file):
> >>
> >>   header("Content-Type: text/csv");
> >>   header("Content-Disposition: attachment; filename=somfilename.csv");
> >>   header("Content-Length: ". filesize($file));
> >>
> >>   readfile($file);
> >>
> >>   I also tried with fopen.
> >>   If I try to download the file directly from apache it works, all 4.7MB are received.
> >>
> >>   As expected, the browser starts the download and reports that it is expecting a 4.7MB file.
> >>   However, the download stops at exactly 2.000.000 bytes no matter what browser I use (normally I
> >>use Firefox on Linux), and no matter whether PHP runs on apache2 or apache1.3.
> >>
> >>   Is there some PHP config option I missed?
> >>   Could this be from curlwrappers?
> >>   Where could this come from?
> >>
> >>
> >>Thanks,
> >>Catalin
> >>
> >>--
> >>PHP General Mailing List (http://www.php.net/)
> >>To unsubscribe, visit: http://www.php.net/unsub.php
> >>
> >>
> 

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


