Re: Preferred way of limiting direct access

On Mon, 2012-02-27 at 10:32 +0000, Lester Caine wrote:
Spring cleaning time ...

I have a number of older dynamically built sites that are still using .htaccess 
to limit access to areas that only PHP needs to read. This is the simple way of 
doing things, but I am looking at current practice, and the performance hit may 
well be something I need to be concerned about?

Best practice is to set the permissions in your main httpd.conf file; if you have thousands of hosts it is much kinder on resources.
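As a rough sketch (the vhost and directory names are only examples), a rule that an .htaccess file would normally carry can live in httpd.conf instead, where it is parsed once at startup rather than on every request:

<Directory "/var/www/hosts/example.com/includes">
    # Same effect as a "deny" .htaccess file in this dir,
    # but without the per-request .htaccess lookups
    AllowOverride None
    Require all denied
</Directory>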


What is the tidiest way to limit access via the <Directory> entry (or do I need 
multiple entries?) so that people can download content in the storage area, but 
only access the functions (PHP pages) in the various packages?

Each package has its own directory under root, and some packages are only used 
'internally', but despite having some directories specifically blocked in the 
robots.txt file, they are still being trawled by search engines and I think I 
need to restore the .htaccess set-up in each?

Basically, should I just 'deny all' at the root and then open holes to the bits that 
need to be visible? The storage directory is easy, and so is .php in the accessible 
packages, but it's the .js, .css, icons and style elements that seem confusing. 
In practice, the basic structure of the project may be wrong for the way Apache 
now expects to work, but keeping everything related to a bundle like 'wiki' in 
one directory allows a modular approach which IS working well otherwise.


Always deny everything by default, then open up what you want accessed on a global scale. Things like wikis, forums and webmail scripts are often awkward for that, because they want to allow access to the top directories but deny sub-directories, so a .htaccess file would be easier to manage; but if there are only a few directories, I'd opt for httpd.conf to keep it cleaner.

base:
<Directory />
    AllowOverride None
    Require all denied
</Directory>

Allow access to the web server root holding the virtual hosts:
<Directory "/var/www/hosts">
    AllowOverride None
    Require all granted
</Directory>

To protect each host's admin section, add a deny entry per domain:

<Directory /var/www/hosts/example.com/admin>
    Require ip 10.10.0 127.0.0.1
    Auth* stuff
    Require valid-user
</Directory>
<Directory /var/www/hosts/example.net/admin>
    Require ip 10.10.0 127.0.0.1
    Auth* stuff
    Require valid-user
</Directory>
...etc...
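In case it helps, a minimal sketch of what that "Auth* stuff" might look like, assuming plain file-based basic auth (the AuthUserFile path is just an example). With 2.4, multiple Require lines in a <Directory> are OR'd together by default, so either a matching IP or a valid login gets in:

<Directory /var/www/hosts/example.com/admin>
    # Either of these satisfies the (implicit) RequireAny container
    Require ip 10.10.0 127.0.0.1
    AuthType Basic
    AuthName "Admin area"
    AuthUserFile /var/www/hosts/example.com/.htpasswd   # example path
    Require valid-user
</Directory>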

Simple, but when you have stuff with 20 sub-dirs that wants to stop you accessing inc, css, lib, foo, bar etc., your httpd.conf becomes rather large and maybe messy, so it's easier to use .htaccess to stop them in those dirs.
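For instance, a one-line .htaccess dropped into each of those sub-dirs does the job, assuming AllowOverride is at least AuthConfig for that tree so the file is honoured (2.4 directive shown):

# contents of e.g. inc/.htaccess - blocks all direct browser access
Require all denied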

Many bots also don't care about robots.txt files, even if you have
User-agent: *
Disallow: /

That says "don't traverse me", but a bot will always hit the root dir of the URL to check it.

Note: if you're using 2.0/2.2, replace Require ip with Order/Deny/Allow plus Satisfy any... you know the drill.
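Roughly, the 2.2-era equivalent of the admin block above would be something like this (same caveat that the paths and auth details are only examples):

<Directory /var/www/hosts/example.com/admin>
    Order deny,allow
    Deny from all
    Allow from 10.10.0 127.0.0.1
    AuthType Basic
    AuthName "Admin area"
    AuthUserFile /var/www/hosts/example.com/.htpasswd   # example path
    Require valid-user
    # Either the IP match or a valid login is enough
    Satisfy any
</Directory>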
