Re: Can I block crawlers from seeing files with certain suffixes, i.e. .php? Thanks.

On Fri, 12 Jul 2013 15:34:22 +0000 (UTC)
mrl <mrl@xxxxxxxxxxxx> wrote:
> Is there a way to block .php files from being indexed by crawlers, but
> allow other types of files to be indexed?  When the crawlers access the
> .php files, they are executed, creating lots of error messages (and
> taking up CPU cycles).  Thanks.

Google for "robots.txt".
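A minimal sketch of what such a robots.txt could look like. Note that the
"*" and "$" wildcards are extensions honored by the major crawlers (e.g.
Googlebot, Bingbot) but are not part of the original robots.txt standard,
which only matches plain path prefixes:

```
# Ask all compliant crawlers to skip any URL ending in .php.
# "*" matches any character sequence; "$" anchors the end of the URL.
User-agent: *
Disallow: /*.php$
```

Place the file at the web root (e.g. http://example.com/robots.txt).
Crawlers that ignore robots.txt will still execute the scripts, so
server-side blocking may also be worth looking into.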

-- 
D'Arcy J.M. Cain
System Administrator, Vex.Net
http://www.Vex.Net/ IM:darcy@xxxxxxx
Voip: sip:darcy@xxxxxxx

---------------------------------------------------------------------
To unsubscribe, e-mail: users-unsubscribe@xxxxxxxxxxxxxxxx
For additional commands, e-mail: users-help@xxxxxxxxxxxxxxxx




