httpd and robots.txt




would anyone out there care to share their experience with robots.txt files
when using CentOS as a web server?

i realize this is a somewhat simple exercise, yet i am sure there are both
large and small hosters out there, and possibly those with high traffic
tune their robots.txt files differently than others?

please share if you can or care to.

for years we have just done a * (allow all) with a Disallow on things like
/cgi-bin
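for reference, a minimal robots.txt along those lines would be (the exact
path is just the stock CentOS httpd document root, adjust as needed):

```
# robots.txt -- placed at the web server's document root,
# e.g. /var/www/html/robots.txt on a stock CentOS httpd install.
# Applies to all crawlers; everything is allowed except /cgi-bin/.
User-agent: *
Disallow: /cgi-bin/
```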

as examples of places to visit, for those out of (or in) the know...

http://www.robotstxt.org/

http://en.wikipedia.org/wiki/Robots_exclusion_standard

http://www.google.com/robots.txt

and others...

quite frankly, there are many bots out there that don't follow this anyway,
right?
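since robots.txt is purely advisory, one common complement (just a sketch on
my part, not something the list has endorsed; the "BadBot" pattern and the
conf.d filename are placeholders) is to deny known-misbehaving user agents in
the httpd config itself, e.g. with Apache 2.4 on CentOS:

```
# /etc/httpd/conf.d/badbots.conf -- hypothetical filename
# Tag requests whose User-Agent matches a known-bad pattern,
# then refuse them access to the document root.
<IfModule mod_setenvif.c>
    SetEnvIfNoCase User-Agent "BadBot" bad_bot
</IfModule>

<Directory "/var/www/html">
    <RequireAll>
        Require all granted
        Require not env bad_bot
    </RequireAll>
</Directory>
```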

anyone?

tia

 - rh

_______________________________________________
CentOS mailing list
CentOS@xxxxxxxxxx
http://lists.centos.org/mailman/listinfo/centos
