Re: Allowing Robots.txt

> On October 10, 2011 12:45, Matt <lm7812@xxxxxxxxx> wrote:
>>
>> I want to restrict HTTP access to the server to certain subnets and
>> require SSL plus a username and password.  The one exception is the
>> robots.txt file, which anyone should be able to access.  How do I
>> tell it not to enforce a password or SSL for robots.txt alone?
>
> Use the "Satisfy any" directive so that httpd will accept EITHER the
> host-based access control ("Allow from all") OR the user authentication
> ("Require valid-user") instead of requiring both of them as it does by
> default ("Satisfy all").  See the example at
> https://httpd.apache.org/docs/2.2/mod/core.html#satisfy

That worked, thanks.  I also had to add "RewriteCond %{REQUEST_URI}
!=/robots.txt" to exempt it from the rewrite rule that forces SSL.
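
For the archives, here is roughly what the combined configuration looks
like.  The subnet and paths below are placeholders, not my real values:

# Redirect everything except robots.txt to HTTPS (needs mod_rewrite)
RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteCond %{REQUEST_URI} !=/robots.txt
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

<Directory /var/www/html>
    # Default policy: must be in the subnet AND give a password
    AuthName "Restricted Area"
    AuthType Basic
    AuthUserFile /var/www/.htpasswd
    Require valid-user
    Order deny,allow
    Deny from all
    Allow from 10.0.0.0/8
    Satisfy all
</Directory>

<Files robots.txt>
    # "Allow from all" passes the host check for everyone, and with
    # "Satisfy any" that alone is enough -- no password prompt
    Order allow,deny
    Allow from all
    Satisfy any
</Files>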

One other thing, though.  Suppose I want to exempt certain directories
from requiring a password but keep all the remaining restrictions.
I currently have this:

AuthName "Restricted Area"
AuthType Basic
AuthUserFile /var/www/.htpasswd
AuthGroupFile /dev/null
Require valid-user

Is there a way to exempt, say, the /downloads/ directory?
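
For what it's worth, the same "Satisfy any" trick from above looks like
it should handle this too, applied per directory.  A minimal sketch,
assuming /downloads/ sits under the same DocumentRoot (the path and
subnet are placeholders):

<Directory /var/www/html/downloads>
    # Passing the host check alone is enough here, so the password
    # prompt goes away while the subnet restriction still applies.
    # Caveat: with "Satisfy any" a valid password would also grant
    # access from outside the subnet; if that matters, scope the
    # Auth* block to only the directories that need it instead.
    Order deny,allow
    Deny from all
    Allow from 10.0.0.0/8
    Satisfy any
</Directory>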
