On Tue, Mar 4, 2014 at 1:56 PM, Bastien Koert <phpster@xxxxxxxxx> wrote:
> Ok, so what I proposed should work fairly well. To keep spiders off, add a
> robots.txt file to the webserver to block them, but hopefully this process
> is hidden behind a password or session to prevent unnecessary generation.
> This note from google might help ..
> https://support.google.com/webmasters/answer/156449?hl=en
>

I had suggested using a robots.txt. I am not certain why, but the project manager wants the robots to have the ability to crawl the PDF links. Thank you for the reference, I will check it out!
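
If it helps, one way to square the two requirements would be a robots.txt that blocks crawlers from the generation script while leaving the PDFs themselves crawlable. Something along these lines might work (the paths below are only placeholders, not our actual layout):

    # hypothetical paths -- adjust to the real script and output directory
    User-agent: *
    Disallow: /generate_pdf.php
    Allow: /pdfs/

Googlebot honours the Allow directive, though simpler crawlers may only act on Disallow lines, so the password/session guard you mentioned is still the safer way to stop unnecessary generation.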