Yes, I thought about this, but it has a big disadvantage - the client
must wait until the file is fully processed and compressed before it
can start downloading. I'd like to let the client start downloading the
compressed parts while the remaining parts are still being processed
and compressed - similar to the streaming concept. I hope you can see
what I mean.
Jakub Čermák
ja.cermi@xxxxxxxxxx
ICQ 159971304
Per Jessen wrote:
Jakub wrote:
That script generates a large text file to download, so I thought I
could gzip it somehow to make the downloads faster. The buffered way
(loading all the output into some $buffer and then echo
gzencode($buffer, 6);) consumes too much memory.
You could write it to a local file, then do:
header("Content-Type: application/x-gzip");
Header("Content-Disposition: attached; filename=\"\"")
passthru("gzip -c <file>");
/Per Jessen, Zürich