On Tue, March 20, 2007 4:37 am, Manuel Vacelet wrote:
> 2007/3/20, Richard Lynch <ceo@xxxxxxxxx>:
>> One common pattern in PHP is to not put the file in the web tree at
>> all, and write a PHP script with 'readfile' (or an fopen/fread/echo
>> loop for larger files).
>>
>> You can then control access to the file, and log any kind of stats
>> you need about accessing the file.
>
> Yes, I already do that with all my scripts that deal with files.
>
>> Once you have that, then you can also put the files on some other
>> server, and use URL fopen to read them, if you like.
>
> Is it considered secure?

As secure as what? I don't think you've established a baseline for
comparison...

That said, it's probably about as secure as almost anything needs to
be, depending on how you limit (or don't limit) access to it.

Assuming you control the other server, you can make it as secure as
you like...

You can start by never ever publishing that link anywhere, so nobody
knows where it is.

That server can also reject any requests that aren't from your web
server's IP (or list of IPs for a web-server farm).

You could set it up with SSL and use curl instead of URL fopen. You'd
probably not want to waste $$$ on a CA, so you'd need the CURLOPT
settings that skip the peer verification. The security of the non-CA
SSL connection is the same as that of a CA-signed one -- it's just a
question of whether you trust the box at the other end of the
connection or not. If you don't trust it, then you're in trouble
before you start... :-)

That second server could even be on a private LAN in a guarded bank
vault, with a second NIC from web server to back end and firewall
rules to refuse cross-talk. Take out a nice big insurance policy, and
you could probably "safely" store credit card numbers on that puppy.

How secure is secure enough? That depends on what your data and
application are, more than on any external factor.

>> I've done that for one site where the webhost has limited hard
>> drive space, and have made it appear as if a terabyte of music is
>> available on a 500-meg site.
>>
>> Or you could use PHP's FTP functions to shuffle them around.
>
> I much prefer this approach because we don't rely on another Apache
> server. We can assume that if a cracker found a security hole on the
> front end, it would be easier to use the same exploit on the second
> server.

Well, yes, if they can even GET to the second server... See above.

> I had a quick look at the FTP functions of PHP and there is an
> interesting usage proposed in the comments:
> http://www.php.net/manual/en/function.ftp-get.php#72603
>
> Coupled with SSL FTP, this could be a good solution.
>
> Any comments, pros, cons, ... ?

I'd also consider curl before FTP, personally, as it is more flexible
if you decide later to use something other than the FTP protocol.

>> It depends more on what you are trying to secure, and why, than it
>> does on any sort of general principle, really... And just personal
>> preference on how to do this sort of thing... And your performance
>> needs are a big factor, sometimes.
>
> Security is the major point (before performance).

It's not that simple...

Would you be happy with a web server that requires a human to review
each HTTP request and sign off on a form in triplicate before the HTTP
response went out?

With enough money and stupidity, you could Spec, Design, and Build
that. You'd definitely have Security before Performance, assuming
trusted people answering and filling out the forms.
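For the record, the curl flavour of that front-end "middle-man" script
could look something like the sketch below. It's untested; the
back-end host, path, and 'filename' parameter are all made up, and it
buffers the whole file in RAM, so for really big files you'd want
CURLOPT_FILE or the fopen/fread/echo loop instead:

<?php
// Hypothetical front-end script: fetch the file from the back-end box
// over HTTPS with curl and hand it to the browser.
// backend.example.com, /files/, and 'filename' are made-up names.

$filename = basename($_GET['filename']);   // never trust a raw path
$url = 'https://backend.example.com/files/' . rawurlencode($filename);

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// Self-signed cert on the back end, so skip the CA/peer checks:
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);

$data   = curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

if ($data === false || $status != 200) {
    header('HTTP/1.0 404 Not Found');
    exit('No such file');
}

// Access checks and stats logging would go here, before any output.
header('Content-Type: application/octet-stream');
header('Content-Length: ' . strlen($data));
header('Content-Disposition: attachment; filename="' . $filename . '"');
echo $data;
?>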
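And the FTP-over-SSL variant Manuel is describing would be in the same
spirit -- again only a sketch, with placeholder host, login, and
paths, and hardly any error handling:

<?php
// Hypothetical FTP-SSL version of the same front-end script.

$filename = basename($_GET['filename']);
$local    = tempnam(sys_get_temp_dir(), 'dl');

$conn = ftp_ssl_connect('backend.example.com');
if (!$conn || !ftp_login($conn, 'fileuser', 'secret')) {
    exit('Cannot reach the file server');
}
ftp_pasv($conn, true);

// Pull the file to a temp file, then stream it out and clean up.
if (ftp_get($conn, $local, '/files/' . $filename, FTP_BINARY)) {
    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . filesize($local));
    header('Content-Disposition: attachment; filename="' . $filename . '"');
    readfile($local);
}
ftp_close($conn);
unlink($local);
?>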
> The main goal is to still be protected if there is an element under
> attack on the application server, for instance a vulnerability in
> Apache (or even PHP, given the month of March ;).

What data are you protecting?

If you can't (or won't) answer that, the advice we can give is
basically meaningless.

> I want to be protected against:
> - a cracker uploads a file and uses a vulnerability to execute it on
> the server (I can avoid that with a partition mounted without exec
> rights, or with another server that hosts the files).

Sure. Or you could just put them outside the web tree and not write
stupid PHP code that lets them get executed.

And you could check the uploaded files for validity, to ensure that
they meet certain criteria of non-executable files in the first place.

E.g., if you expect a JPEG, then it should have the correct first N
bytes of a valid JPEG (there's a quick sketch of that check at the end
of this mail). It's gonna be dang hard for a bad guy to upload a
binary file with a payload that is not only executable and does
something bad, but also has the first N bytes of a valid JPEG. I'm not
going to claim it's impossible, because some smart-ass somewhere has
probably managed it... But it sure as heck is uncommon.

> - a cracker uses a vulnerability and obtains the same rights as the
> web server (due to mod_php); she will then be able to access all the
> files (at least in read mode), because the user who runs Apache has
> to be able to read them.

Is this on a shared server?

Is your PHP binary-reading script dumb enough to allow them to access
files they shouldn't be accessing?

I.e., there is a HUGE difference between:

<?php
  //WRONG! Serves up any file the web server user can read:
  readfile($_GET['filename']);
?>

and

<?php
  //Better: strip any path, and only serve from one known directory:
  $filename = basename($_GET['filename']);
  readfile("/path/to/files/they/should/get/$filename");
?>

> There are probably other things I haven't imagined, but I think
> using another server to host the data is a good approach.

I think it's a great approach, if the data being secured warrants it
and the web application is well-written.

I think it's a waste of time if the data being secured is not worth
securing, or if the PHP script is so badly written that jumping
through the second-server hoop is no barrier at all, or, worse, opens
up even MORE vulnerabilities.

I (still) have no idea which camp you are in, no offense intended.
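FWIW, the "first N bytes of a valid JPEG" check can be as dumb as the
sketch below -- untested, and the upload field name 'userfile' and the
destination directory are made up, so adapt both to your own form:

<?php
// Quick sanity check on an uploaded "JPEG" before accepting it.

$tmp = $_FILES['userfile']['tmp_name'];

// Every JPEG starts with the bytes FF D8 FF.
$fp   = fopen($tmp, 'rb');
$head = fread($fp, 3);
fclose($fp);

if ($head !== "\xFF\xD8\xFF") {
    exit('That is not a JPEG; go away.');
}

// getimagesize() has to parse the header to return a type at all,
// so it makes a cheap second check.
$info = getimagesize($tmp);
if ($info === false || $info[2] != IMAGETYPE_JPEG) {
    exit('That is not a JPEG; go away.');
}

// Only now move it somewhere outside the web tree.
move_uploaded_file($tmp,
    '/path/outside/webtree/' . basename($_FILES['userfile']['name']));
?>

getimagesize() is doing most of the real work there; the raw byte
check is just the literal "first N bytes" idea.

-- 
Some people have a "gift" link here.
Know what I want?
I want you to buy a CD from some starving artist.
http://cdbaby.com/browse/from/lynch
Yeah, I get a buck. So?

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php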