On 6/9/07, Stut <stuttle@xxxxxxxxx> wrote:
Tijnema wrote:
> On 6/9/07, Stut <stuttle@xxxxxxxxx> wrote:
>> Tijnema wrote:
>> > On 6/9/07, Stut <stuttle@xxxxxxxxx> wrote:
>> >> Tijnema wrote:
>> >> > Hmm, fseek seems cool, but what about FTP resources? If I open them
>> >> > with ftp_connect, do I need to fetch all data from FTP again, and then
>> >> > just trash all data I don't need?
>> >>
>> >> Yes, but depending on what you're actually doing you may be able to
>> >> cache enough to skip large chunks in subsequent requests.
>> >>
>> >> -Stut
>> >
>> > Well, I'm working on a script that can transfer large files over HTTP.
>> > The point is that I have an HTTP proxy at school, and so I can use HTTP
>> > only. Moreover, they blocked downloading files bigger than 1MB, and
>> > so I wanted to transfer files in packets of, let's say, 990KB, so that
>> > they can be downloaded fine through the HTTP proxy. This means I'm
>> > repeatedly calling the script for the next chunk, and that's why I wanted
>> > to keep the FTP connection open.
>>
>> Write it to a temporary file and store that temporary filename in the
>> session. When the page is called, get the filesize of the temporary file
>> and use that as the start of the next chunk to download.
>>
>> -Stut
>
> Do you think this is the best way to do it?
> Even for large files, let's say 1 DVD (4.7GB)?

I can't think of a better way. This way you're building up the file on
disk chunk by chunk, not storing much in the session, and it should
scale to any size of file quite well. The only thing you might have to
watch is cleaning up the files created by aborted downloads.

-Stut
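A rough sketch of what Stut is describing, for the case where the script
sits outside the proxy, pulls the file from an FTP server, and relays it
to the browser in pieces of roughly 990KB. The FTP host, login and file
names below are placeholders, and the non-blocking ftp_nb_fget() /
ftp_nb_continue() pair is only one way of stopping the fetch after about
one chunk; Stut's message doesn't prescribe it.

<?php
// Sketch only: resume an FTP download in ~990KB pieces, using the size
// of a temporary file (remembered in the session) as the resume point,
// and relay each piece to the browser. Host, login and paths are
// placeholders.
session_start();

$chunk = 990 * 1024; // leaves headroom under the proxy's 1MB limit

if (!isset($_SESSION['tmpfile'])) {
    $_SESSION['tmpfile'] = tempnam(sys_get_temp_dir(), 'dl');
}
$tmpfile = $_SESSION['tmpfile'];

clearstatcache();
$offset = filesize($tmpfile); // bytes fetched so far = where to resume

$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'anonymous', 'user@example.com');
ftp_pasv($conn, true);

// Append the next chunk to the temporary file. ftp_nb_fget() starts the
// transfer at $offset on the remote side, and the non-blocking loop lets
// us stop pulling data once roughly one chunk has arrived.
$fp = fopen($tmpfile, 'r+b');
fseek($fp, 0, SEEK_END);
$ret = ftp_nb_fget($conn, $fp, '/pub/dvd.iso', FTP_BINARY, $offset);
while ($ret == FTP_MOREDATA && ftell($fp) < $offset + $chunk) {
    $ret = ftp_nb_continue($conn);
}
fclose($fp);
ftp_close($conn); // abandon the rest; no need to keep the connection open

// Send only the newly fetched bytes back to the browser.
clearstatcache();
$new = filesize($tmpfile) - $offset;
header('Content-Type: application/octet-stream');
if ($new > 0) {
    $out = fopen($tmpfile, 'rb');
    fseek($out, $offset);
    echo fread($out, $new);
    fclose($out);
}
?>

The client keeps requesting the page until a chunk comes back smaller
than 990KB and glues the pieces together; the temporary files left
behind by abandoned downloads are the cleanup issue Stut mentions.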
What about creating a new file for each chunk? Would that be faster and
less resource intensive than using fseek all the time?

Tijnema
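For comparison, a minimal sketch of the variant Tijnema is asking about,
where every chunk goes into its own small file and a counter in the
session replaces the filesize() lookup. Again the FTP details are
placeholders, and the ftp_nb_fget() / ftp_nb_continue() pair is an
assumption about how the fetch is limited to one chunk.

<?php
// Sketch only: one small file per chunk instead of one growing file.
// The chunk number stored in the session tells us where to resume.
session_start();

$chunk = 990 * 1024;

if (!isset($_SESSION['chunkdir'])) {
    $_SESSION['chunkdir'] = sys_get_temp_dir() . '/dl_' . session_id();
    mkdir($_SESSION['chunkdir']);
    $_SESSION['chunkno'] = 0;
}

$no     = $_SESSION['chunkno'];
$offset = $no * $chunk;                    // resume point on the FTP side
$file   = $_SESSION['chunkdir'] . '/part' . $no;

$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'anonymous', 'user@example.com');
ftp_pasv($conn, true);

// Fetch just this chunk into its own file.
$fp  = fopen($file, 'wb');
$ret = ftp_nb_fget($conn, $fp, '/pub/dvd.iso', FTP_BINARY, $offset);
while ($ret == FTP_MOREDATA && ftell($fp) < $chunk) {
    $ret = ftp_nb_continue($conn);
}
if (ftell($fp) > $chunk) {
    ftruncate($fp, $chunk); // keep chunks exactly 990KB so $no * $chunk stays correct
}
fclose($fp);
ftp_close($conn);

$_SESSION['chunkno'] = $no + 1;

header('Content-Type: application/octet-stream');
readfile($file); // no fseek() needed at all
?>

Whether this is faster is doubtful: fseek() on a local file is cheap, so
both variants do essentially the same work per request. The choice is
mostly about which layout is easier to clean up after aborted downloads,
the point Stut already raises.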