Re: Emergency! Performance downloading big files

Try using cURL; with the curl_multi functions you can run many downloads simultaneously, and you can write each response straight to disk instead of buffering the whole file in a PHP string first.
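
Something like the following (a rough, untested sketch; the file names and
URLs are placeholders, while the D:\DocShare\ path and the 1200-second
timeout are taken from your message) uses curl_multi to run several
transfers in parallel and CURLOPT_FILE to stream each response straight to
a file handle, so a 50 MB document never has to sit in memory:

<?php
// Placeholder job list: target filename => credentialed https URL.
$jobs = array(
    'doc1.pdf' => 'https://example.com/doc1?cred=...',
    'doc2.pdf' => 'https://example.com/doc2?cred=...',
);
$target_dir = 'D:\\DocShare\\';

$mh      = curl_multi_init();
$handles = array();

foreach ($jobs as $filename => $url) {
    $fp = fopen($target_dir . $filename, 'wb');
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FILE, $fp);       // write the body straight to disk
    curl_setopt($ch, CURLOPT_TIMEOUT, 1200);   // same timeout as your script
    curl_multi_add_handle($mh, $ch);
    $handles[] = array($ch, $fp);
}

// Drive all transfers until every one has finished.
do {
    $status = curl_multi_exec($mh, $running);
} while ($status == CURLM_CALL_MULTI_PERFORM);

while ($running && $status == CURLM_OK) {
    if (curl_multi_select($mh) === -1) {
        usleep(100000);                        // select can fail on some platforms
    }
    do {
        $status = curl_multi_exec($mh, $running);
    } while ($status == CURLM_CALL_MULTI_PERFORM);
}

foreach ($handles as $pair) {
    list($ch, $fp) = $pair;
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
    fclose($fp);
}
curl_multi_close($mh);
?>

Don't add all several thousand URLs to one multi handle at once; add a
batch of perhaps 10-20 and top the pool up as transfers complete, or you
will run out of sockets and file handles.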

On Wed, Dec 2, 2009 at 12:48 AM, Brian Dunning <brian@xxxxxxxxxxxxxxxx> wrote:

> This is a holiday-crunch emergency.
>
> I'm dealing with a client from whom we need to download many large PDF docs
> 24x7, several thousand per hour, all between a few hundred K and about 50
> MB. Their security process requires the files to be downloaded via https
> using a big long URL with lots of credential parameters.
>
> Here's how I'm doing it. This is on Windows, a quad Xeon with 16GB RAM:
>
> $ctx = stream_context_create(array('http' => array('timeout' => 1200)));
> $contents = file_get_contents($full_url, 0, $ctx);
> $fp = fopen('D:\\DocShare\\'.$filename, "w");
> $bytes_written = fwrite($fp, $contents);
> fclose($fp);
>
> It's WAY TOO SLOW. I can paste the URL into a browser and download even the
> largest files quite quickly, but the PHP method bottlenecks and cannot keep
> up.
>
> Is there a SUBSTANTIALLY faster way to download and save these files? Keep
> in mind the client's requirements cannot be changed. Thanks for any
> suggestions.
>
>


-- 
Use ROT26 for best security
