On Fri, 12 Jul 2013 15:34:22 +0000 (UTC) mrl <mrl@xxxxxxxxxxxx> wrote:
> Is there a way to block .php from being indexed by crawlers, but
> allow other type files to be indexed? When the crawlers access the
> php files, they are executed, creating lots of error messages (and
> taking up cpu cycles). Thanks.

Google for "robots.txt".

--
D'Arcy J.M. Cain
System Administrator, Vex.Net
http://www.Vex.Net/
IM:darcy@xxxxxxx  Voip: sip:darcy@xxxxxxx

---------------------------------------------------------------------
To unsubscribe, e-mail: users-unsubscribe@xxxxxxxxxxxxxxxx
For additional commands, e-mail: users-help@xxxxxxxxxxxxxxxx
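
[Editor's note: a minimal robots.txt sketch for the question above, assuming the crawlers you care about honor the * and $ wildcard extensions (Googlebot and Bingbot do; they are not part of the original robots exclusion standard). The file must be served from the site root, e.g. http://www.example.com/robots.txt:

    User-agent: *
    # Ask crawlers not to request any URL ending in .php
    Disallow: /*.php$
    # Also cover .php URLs that carry a query string
    Disallow: /*.php?

Note that robots.txt is advisory only: well-behaved crawlers will stop requesting the PHP URLs (and so stop triggering the errors and CPU load), but misbehaving bots ignore it and would have to be blocked at the server instead.]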