Re: Keeping file pointers open after script end

On Sat, 2007-06-09 at 22:23 +0100, Stut wrote:
> Tijnema wrote:
> > On 6/9/07, Stut <stuttle@xxxxxxxxx> wrote:
> >> Tijnema wrote:
> >> > On 6/9/07, Stut <stuttle@xxxxxxxxx> wrote:
> >> >> Tijnema wrote:
> >> >> > On 6/9/07, Stut <stuttle@xxxxxxxxx> wrote:
> >> >> >> Tijnema wrote:
> >> >> >> > Hmm, fseek seems cool, but what about FTP resources? If I open
> >> >> >> > them with ftp_connect, do I need to fetch all data from FTP
> >> >> >> > again, and then just trash all data I don't need?
> >> >> >>
> >> >> >> Yes, but depending on what you're actually doing you may be able to
> >> >> >> cache enough to skip large chunks in subsequent requests.
> >> >> >>
> >> >> >> -Stut
> >> >> >
> >> >> > Well, I'm working on a script that can transfer large files over
> >> >> > HTTP. The point is that I have an HTTP proxy at school, so I can
> >> >> > use HTTP only. Moreover, they blocked downloading files bigger
> >> >> > than 1MB, so I wanted to transfer files in packets of, let's say,
> >> >> > 990KB, so that they can be downloaded fine through the HTTP proxy.
> >> >> > This means I'm repeatedly calling the script for the next chunk,
> >> >> > and that's why I wanted to keep the FTP connection open.
> >> >>
> >> >> Write it to a temporary file and store that temporary filename in
> >> >> the session. When the page is called, get the filesize of the
> >> >> temporary file and use that as the start of the next chunk to
> >> >> download.
> >> >>
> >> >> -Stut
> >> >
> >> > Do you think this is the best way to do it?
> >> > Even for large files, let's say one DVD (4.7GB)?
> >>
> >> I can't think of a better way. This way you're building the file up
> >> on disk chunk by chunk, you're not storing much in the session, and it
> >> should scale to any file size quite well. The only thing you might
> >> have to watch is cleaning up the files created by aborted downloads.
> >>
> >> -Stut
> > 
> > What about creating a new file for each chunk?
> > Would that be faster and less resource intensive than using fseek all
> > the time?
> 
> No, it would be more resource intensive and probably slower. Seeking 
> through a file is a very cheap operation for most OSes.

As long as it's not on a tape drive >:)
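
For the archives, here's a minimal sketch of the temp-file approach Stut
describes, assuming the script is hit once per chunk. The FTP host,
credentials, and file names are placeholders; the ftp:// stream wrapper's
resume_pos context option is one way to avoid re-fetching and discarding
the bytes you already have:

```php
<?php
// Sketch only: build the file up on disk chunk by chunk and use its
// current size as the resume offset for the next request.
// Hostname, credentials, and filenames below are hypothetical.
session_start();

$chunkSize = 990 * 1024; // stay under the proxy's 1MB cap

// One temp file per download, remembered across requests in the session.
if (!isset($_SESSION['tmpfile'])) {
    $_SESSION['tmpfile'] = tempnam(sys_get_temp_dir(), 'dl_');
}
$tmp = $_SESSION['tmpfile'];

// Bytes fetched so far = where the next chunk starts.
$offset = filesize($tmp);

// The ftp:// wrapper's resume_pos context option asks the server to
// start mid-file, instead of reading and trashing the earlier bytes.
$ctx = stream_context_create(['ftp' => ['resume_pos' => $offset]]);
$src = fopen('ftp://user:pass@example.com/big.iso', 'rb', false, $ctx);

$data = fread($src, $chunkSize);
fclose($src);

// Append this chunk; the file's new size is the next request's offset.
file_put_contents($tmp, $data, FILE_APPEND);
```

Cleaning up the temp files from aborted downloads (as Stut notes) is left
out here; a cron job that deletes stale dl_* files would be one option.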

Cheers,
Rob.
-- 
.------------------------------------------------------------.
| InterJinn Application Framework - http://www.interjinn.com |
:------------------------------------------------------------:
| An application and templating framework for PHP. Boasting  |
| a powerful, scalable system for accessing system services  |
| such as forms, properties, sessions, and caches. InterJinn |
| also provides an extremely flexible architecture for       |
| creating re-usable components quickly and easily.          |
`------------------------------------------------------------'

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php

