[users@httpd] authenticating robots by user-agent & double reverse lookup

Good morning.
 
After reading the latest Google Webmaster blog entry (http://googlewebmastercentral.blogspot.com/2006/09/how-to-verify-googlebot.html), I got to thinking about how one might configure Apache to deny visits from bots that purport to be Googlebot (or anything else) but do not pass a double reverse DNS lookup for googlebot.com.
 
Reading the docs, it looks like mod_access will do a double reverse lookup on something like:
 
Allow from googlebot.com
 
And mod_access can also respect environment variables (via Allow from env=...) set with something like:
 
SetEnvIf User-Agent Googlebot is_googlebot
 
But, how would a webmaster go about combining these to stop scammers?
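For context, here is one way I imagine the two pieces might fit together (a sketch only, untested; the environment-variable name claims_googlebot is my own, and this assumes mod_setenvif and mod_access are loaded):

```apache
# Flag any request whose User-Agent claims to be Googlebot
SetEnvIfNoCase User-Agent Googlebot claims_googlebot

<Directory "/var/www">
    # With Order Deny,Allow, the Deny list is evaluated first,
    # a matching Allow overrides it, and the default is Allow.
    Order Deny,Allow

    # Requests claiming to be Googlebot are denied...
    Deny from env=claims_googlebot

    # ...unless the client's address passes the double reverse
    # lookup for googlebot.com (Allow from a domain name forces
    # the double reverse lookup).
    Allow from googlebot.com

    # Requests not claiming to be Googlebot match neither list
    # and fall through to the default Allow.
</Directory>
```

I'm not certain this is the right combination, which is why I'm asking the list.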
 
Thanks for any tips!
 
-Brice
