Re: Can I block crawlers from seeing files with certain suffixes, e.g. .php? Thanks.


 



An ideal way to do this would be to check the crawler's user agent in your index page and link only to the static content you want the crawler to see. That both saves CPU cycles and gets rid of the errors the crawlers are triggering.
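[The same idea can also be applied at the server level with mod_rewrite, rewriting crawler requests for .php URLs to a static page instead of executing the script. This is only a sketch: the bot names and the /static/ path are placeholders, not part of anyone's actual setup.]

```apache
# Requires mod_rewrite. Bot names and target path are illustrative.
RewriteEngine On
# Match common crawler user agents, case-insensitively
RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot|slurp) [NC]
# Serve a static page instead of executing the PHP script
RewriteRule \.php$ /static/index.html [L]
```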
------Original Message------
From: mrl
To: users@xxxxxxxxxxxxxxxx
ReplyTo: users@xxxxxxxxxxxxxxxx
Subject: Can I block crawlers from seeing files with certain suffixes, e.g. .php? Thanks.
Sent: Jul 12, 2013 11:34 AM

Is there a way to block .php files from being indexed by crawlers, while allowing
other file types to be indexed?  When the crawlers access the php files, they are
executed, generating lots of error messages (and taking up cpu cycles).  Thanks.
- Mark
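[For what it's worth, the usual first step here is a robots.txt file at the site root; well-behaved crawlers honor it before fetching anything. A minimal sketch — note that the wildcard/`$` suffix syntax is an extension honored by the major crawlers (e.g. Googlebot), not part of the original robots exclusion standard, so misbehaving bots may ignore it:]

```
User-agent: *
Disallow: /*.php$
```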



---------------------------------------------------------------------
To unsubscribe, e-mail: users-unsubscribe@xxxxxxxxxxxxxxxx
For additional commands, e-mail: users-help@xxxxxxxxxxxxxxxx


Sent from my BlackBerry device on the Rogers Wireless Network





