Re: Using Curl to replicate a site


 



On Thu, 2009-12-10 at 16:25 +0000, Ashley Sheridan wrote:

> On Thu, 2009-12-10 at 11:25 -0500, Robert Cummings wrote:
> 
> > Joseph Thayne wrote:
> > > If the site can be a few minutes behind, (say 15-30 minutes), then what 
> > > I recommend is to create a caching script that will update the necessary 
> > > files if the md5 checksum has changed at all (or a specified time period 
> > > has passed).  Then store those files locally, and run local copies of the 
> > > files.  Your performance will be much better than if you have to request 
> > > the page from another server every time.  You could run this script 
> > > every 15-30 minutes depending on your needs via a cron job.
> > 
> > Use URL rewriting or capture 404 errors to handle the proxy request. No 
> > need to download and cache the entire site if everyone is just 
> > requesting the homepage.
> > 
> > Cheers,
> > Rob.
> > -- 
> > http://www.interjinn.com
> > Application and Templating Framework for PHP
> > 
> 
> 
> Yeah, I was going to use the page request to trigger the caching
> mechanism, as it's unlikely that all pages are going to be equally
> popular. I'll let you all know how it goes!
> 
> Thanks,
> Ash
> http://www.ashleysheridan.co.uk
> 
> 


Well, I got it working just great in the end. Aside from the odd issue
with relative URLs used to reference images and JavaScript files, which I
had to sort out, everything seems to be working fine and is live. I've got
it on a 12-hour refresh, as the site will probably not be changing very
often at all. Thanks for all the pointers!
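
In case it's useful to anyone finding this in the archives, the core of
it is roughly the sketch below. It's simplified, and the script name,
cache path and remote URL are just placeholders rather than what I
actually deployed. The idea is the combination suggested above: the
script is invoked when a page is requested (e.g. via a mod_rewrite rule
or an Apache ErrorDocument 404 handler pointing at it), and it only
re-fetches the remote page with curl when the local copy is missing or
older than the refresh period.

<?php
// proxy.php -- placeholder name, not the exact script in production.

$remoteBase = 'http://www.example.com';   // site being mirrored (placeholder)
$cacheDir   = __DIR__ . '/cache';         // local cache directory (placeholder)
$ttl        = 12 * 3600;                  // refresh after 12 hours

$path      = $_SERVER['REQUEST_URI'];
$cacheFile = $cacheDir . '/' . md5($path) . '.html';

// Re-fetch only when there is no local copy yet, or it is older than the TTL.
if (!is_file($cacheFile) || (time() - filemtime($cacheFile)) > $ttl)
{
    $ch = curl_init($remoteBase . $path);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    $html = curl_exec($ch);
    curl_close($ch);

    if ($html !== false)
    {
        // Point relative image/script/stylesheet references back at the
        // remote site by injecting a <base> tag -- this was the fiddly bit.
        $html = preg_replace('/<head([^>]*)>/i',
                             '<head$1><base href="' . $remoteBase . '/">',
                             $html, 1);
        file_put_contents($cacheFile, $html);
    }
}

// Serve whatever copy we have; a stale copy is better than nothing if the
// remote fetch failed.
if (is_file($cacheFile))
{
    readfile($cacheFile);
}
else
{
    header('HTTP/1.0 502 Bad Gateway');
    echo 'Could not fetch the remote page.';
}

If you'd rather not pay the fetch cost on a live request, the same fetch
loop could just as easily be driven by a cron job over a list of URLs,
as Joseph suggested earlier in the thread.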

Thanks,
Ash
http://www.ashleysheridan.co.uk


