Re: Reading remote files

I think this also depends on the operating system. I would say that any
development team would avoid loading file-type data into fast memory; these
problems come up in applications everywhere. From the PHP point of view, it
could mean that the file data has to be read into memory, but that does not
necessarily mean the data has to sit in a memory chip. As smart as operating
systems, Apache and PHP are designed, I would expect the developers to provide
some disk-caching mechanism for large blocks of data.

So if you have not had any problem yet, define a test with the average traffic
you are expecting and see what happens. I see a pretty good chance that it
will not be much of a problem.
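
For such a test, here is a minimal sketch of the kind of script I have in
mind. It assumes the file lives on the same server as the PHP script, and
$path, $offset and $size are just placeholders for whatever your standalone
application sends:

<?php
// Hypothetical chunk-serving script: read only the requested slice
// of a large file instead of pulling the whole file into memory.
$path   = '/data/bigfile.bin';  // placeholder path
$offset = 1048576;              // requested start position, in bytes
$size   = 65536;                // requested chunk size, in bytes

$fp = fopen($path, 'rb');
if ($fp === false) {
    exit('could not open file');
}
fseek($fp, $offset);           // jump straight to the offset
$chunk = fread($fp, $size);    // only $size bytes are held in memory
fclose($fp);

header('Content-Type: application/octet-stream');
echo $chunk;
?>

fread() only pulls $size bytes into the PHP process, so even a 500 MB file
should not come near the memory limit with a scheme like this; run it under
the load you expect and watch the memory usage.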

ralph_deffke@xxxxxxxx

"Grace Shibley" <shibleyg@xxxxxxxxx> wrote in message
news:a4d1d5260909011055o55689189n4e42af2e319fb24@xxxxxxxxxxxxxxxxx
> Are you actually having a problem with memory, or simply that you have
> to transfer it over a network first? Depending on the protocol used, you
> may be able to read it in chunks, but those chunks will still have to be
> copied to the computer that is reading it before it can be processed.
>
> The other option is to run a process in the computer where the file
> resides and only send the results over the network.
>
> Bob McConnell
>
>
> We haven't actually had a problem yet, but we don't want to run a risk of
> a server crash.  We want to be able to call this PHP function from a
> standalone application that will get that particular chunk of data
> specified and save it to the local drive.
> But, so far, we have been told that any function we use (fopen/fread,
> file_get_contents) will first load the entire file into memory.
>
> As far as I know then, HTTP doesn't support entering files at points
> specified by a remote user. A request is made for a file, and the server
> determines how to break it up in order to send.
>
> Apparently, with file_get_contents, you can specify an offset and a
> datasize, but it still loads the whole file first.  Is this true?
>
>
> On Tue, Sep 1, 2009 at 10:46 AM, Ashley Sheridan
> <ash@xxxxxxxxxxxxxxxxxxxx>wrote:
>
> > On Tue, 2009-09-01 at 10:43 -0700, Grace Shibley wrote:
> > > HTTP
> > >
> > > On Tue, Sep 1, 2009 at 10:36 AM, Ashley Sheridan
> > > <ash@xxxxxxxxxxxxxxxxxxxx>wrote:
> > >
> > > > On Tue, 2009-09-01 at 10:34 -0700, Grace Shibley wrote:
> > > > > Is there a way to read large (possibly 500 MB) remote files
> > > > > without loading the whole file into memory?
> > > > > We are trying to write a function that will return chunks of
> > > > > binary data from a file on our server given a file location,
> > > > > specified offset and data size.
> > > > >
> > > > > But, we have not been able to get around loading the whole file
> > > > > into memory first.  Is there a way to do this??
> > > >
> > > > What sort of remote file is it, i.e. how are you remotely
> > > > connecting to it? FTP, HTTP, SSH?
> > > >
> > > > Thanks,
> > > > Ash
> > > > http://www.ashleysheridan.co.uk
> > > >
> > > >
> > > >
> > > >
> > As far as I know then, HTTP doesn't support entering files at points
> > specified by a remote user. A request is made for a file, and the server
> > determines how to break it up in order to send.
> >
> > Thanks,
> > Ash
> > http://www.ashleysheridan.co.uk
> >
> >
> >
> >
>



-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php

