Re: Delivering large files via PHP (>300MB)


 



On Tuesday 14 December 2004 15:53, Richard Davey wrote:
> Hello rouvas,
>
> Tuesday, December 14, 2004, 1:33:07 PM, you wrote:
>
> r> Why don't you take the PHP out of the loop entirely?
> r> Make a dir into the Apache area with a custom .htaccess
> r> (with usernames/passwords, etc) and put the required files there.
>
> Then the files have to be within the web root and it'll only take one
> person to share out the username/password.

Not to the web root, but to an arbitrarily named, on-the-fly created dir 
protected with a *custom* (and different for each dir) .htaccess file (and 
accompanying htpasswd entries). Then there would be no single password to share.
You can even make it time-limited, so that it expires after a predefined period.
And anyway, what's stopping the user from sharing the file after it has been 
downloaded onto their machine?
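As a rough sketch of what I mean (the paths, the user name, and the web-root layout are assumptions for illustration, not a finished implementation; a 2004 install would use older hashing functions than these):

```php
<?php
// Sketch: create a one-off, unguessably named directory under the
// Apache-served area, protected by its own .htaccess/.htpasswd pair.
$base    = '/var/www/downloads';        // assumption: served by Apache
$dirName = bin2hex(random_bytes(16));   // random, per-purchase dir name
$dir     = "$base/$dirName";
mkdir($dir, 0755);

$user = 'customer42';                   // hypothetical per-download user
$pass = bin2hex(random_bytes(8));
// crypt() output in this form is accepted by Apache's htpasswd files
$entry = $user . ':' . crypt($pass, '$1$' . substr($dirName, 0, 8) . '$');
file_put_contents("$dir/.htpasswd", $entry . "\n");

file_put_contents("$dir/.htaccess",
    "AuthType Basic\n" .
    "AuthName \"Download\"\n" .
    "AuthUserFile $dir/.htpasswd\n" .
    "Require valid-user\n");

// then copy or symlink the purchased file into $dir and hand the buyer
// the URL of $dirName plus $user/$pass; a cron job can delete expired dirs
```

Apache then does the authentication and the actual file delivery; PHP only sets things up.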

> It needs controlling as to
> who can download and how many times. PHP has to be in the loop
> somewhere (although granted, not for the actual file delivery).

Sure you need to control it. But you need to control when, how (and if) the 
file gets to the client, not what or from where it gets served. Which, in my 
mind, calls for something on the client side along the lines of your program.

>
> r> From the thread I understood that you don't split the file into smaller
> r> chunks, but instead serve chunks from the same big file. This is bad
> r> practice, as I've found out from personal experience. It is better to
> r> serve small files as they finish earlier and free the server processes.
>
> What's the difference between serving a small 1MB file, and reading in
> 1MB of data from a 300MB file, closing that read operation and then
> outputting the result? I cannot see how actually splitting the file
> into 1MB chunks on the server will make it finish earlier. 1MB of data
> is 1MB of data, regardless how PHP read it in. The only real advantage
> might be in disk seek times however, so PHP wouldn't have to seek into
> the middle of a large file for example.

Assuming that (a) you are sharing the same big file and (b) the number of 
users downloading is "significant", then:
(a) PHP is slower than Apache
(b) Apache can cache the 1MB files, at least some of them, and serve them to 
the next client

> r> Also, this would allow users that already have other download
> r> accelerators installed to grab the files.
>
> Download accelerators need a direct link to the file itself. The
> moment we have that, we're back to square one again.

The URL handed to the download accelerator could contain authentication info.
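One way to do that, as a sketch (the script name, parameters, and secret are all hypothetical): put an expiring HMAC token in the query string, so the link works in any client, including an accelerator, but stops working after the deadline.

```php
<?php
// Sketch: build a time-limited download URL carrying an HMAC token,
// and the check the download script would run before serving.
$secret  = 'server-side-secret';   // hypothetical; never sent to the client
$file    = 'big.iso';
$expires = time() + 3600;          // link valid for one hour

$token = hash_hmac('sha1', $file . '|' . $expires, $secret);
$url   = "/fetch.php?f=$file&e=$expires&t=$token";

// fetch.php recomputes the HMAC and compares before streaming the file:
function tokenValid(string $file, int $expires, string $token, string $secret): bool
{
    if ($expires < time()) {
        return false;              // link has expired
    }
    $expected = hash_hmac('sha1', $file . '|' . $expires, $secret);
    return hash_equals($expected, $token);
}
```

Sharing the URL still works until it expires, of course, but no credentials leak and the window is bounded.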

> If it was that simple then when you buy something like a Symantec

[...snip...]

I don't think it's complicated. BTW, I don't find your solution complicated; 
on the contrary, it is quite straightforward, and to be honest I don't think 
there is any reason to change it.
I'm only replying to offer an alternative...

-Stathis

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php

