Re: Large(ish) scale PDF file caching

On Tue, Mar 4, 2014 at 1:56 PM, Bastien Koert <phpster@xxxxxxxxx> wrote:

> Ok, so what I proposed should work fairly well. To keep spiders off, add a
> robots.txt file to the webserver to block them, but hopefully this process
> is hidden behind a password or session to prevent unnecessary generation.
> This note from Google might help:
> https://support.google.com/webmasters/answer/156449?hl=en
>
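For reference, a minimal robots.txt along the lines Bastien describes might
look like this (the /pdf/ path is hypothetical; substitute whatever directory
the generated files are actually served from):

    User-agent: *
    Disallow: /pdf/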

I had suggested using a robots.txt file, but the project manager wants the
robots to be able to crawl the PDF links; I am not certain why. Thank you for
the reference, I will check it out!
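Since the spiders will be allowed through, a cache-on-first-request approach
at least keeps repeated crawler hits from regenerating the same document. A
rough sketch, assuming a hypothetical generatePdf() function and a writable
cache/ directory next to the script:

    <?php
    // Serve a cached copy if one exists; generate only on a cache miss.
    $id = preg_replace('/[^A-Za-z0-9_-]/', '', $_GET['id']); // whitelist the id to block path tricks
    $cacheFile = __DIR__ . '/cache/' . $id . '.pdf';

    if (!is_file($cacheFile)) {
        $pdf = generatePdf($id); // hypothetical: whatever builds the PDF today
        file_put_contents($cacheFile, $pdf, LOCK_EX);
    }

    header('Content-Type: application/pdf');
    header('Content-Length: ' . filesize($cacheFile));
    readfile($cacheFile);

Invalidation can then be as simple as unlink()ing the cached file whenever the
underlying data changes.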
