Hi, I'm trying to find a good solution to this problem.

I want to serve downloads from a directory outside DocumentRoot, so the files cannot be fetched through a direct URL like http://site/test.zip; they should only be downloadable after the user logs in.

I know I can do that with functions like fopen() + fread() or readfile(), echoing the file contents to the browser with the correct headers. But reading the file and then dumping it to the browser is a big burden on the server. In one test I "ate" about 1.8% of RAM (measured with "ps aux" on Linux, on a server with 2 GB of RAM) just to serve a single 30 MB file at 60 KB/s. Imagine what such a dump-to-browser PHP script would do with 50 to 100 concurrent downloads. I'd probably need a terabyte of RAM to provide the downloads ;)

So here is my question: is there another way to protect files against direct downloading, i.e. forcing users to log in and denying direct URLs?

I also know I can check the Referer header with mod_rewrite in Apache, but that isn't secure, since the referer may not be sent at all or can easily be faked.

Please help me ;) Thank you!

--------
Script I used to test:

<?php
$file = "test.tar.gz";

header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('Content-Length: ' . filesize($file));

@readfile($file) or die();
?>

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
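For reference, the fopen() + fread() variant I mentioned would look roughly like the sketch below. The chunk size and the output-buffering handling are just my guesses, not something I have measured, and the file name is only a placeholder:

<?php
// Sketch of the chunked fopen()/fread() variant mentioned above.
// File name and chunk size are placeholders; not a tested solution.

$file  = "test.tar.gz";   // file living outside DocumentRoot
$chunk = 8192;            // read and echo 8 KB at a time

header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('Content-Length: ' . filesize($file));

// Turn off PHP output buffering so each chunk goes straight to the
// client instead of piling up in memory.
while (ob_get_level() > 0) {
    ob_end_flush();
}

$fp = fopen($file, 'rb') or die();
while (!feof($fp)) {
    echo fread($fp, $chunk);
    flush();
}
fclose($fp);
?>

Whether this actually keeps the per-request memory down in practice (or whether Apache still buffers the whole response) is exactly the part I'm unsure about.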