Hello,

I'm evaluating the possibility of blocking the upload of large files (e.g. over 100-200 kB) from within a LAN (for all users except a few selected ones), while leaving all other kinds of Internet activity freely available to everyone. What I have available for this is a Debian Linux box. I came to understand that I should use different tools for different protocols:

HTTP
--------
I'm trying to configure Squid to suit my needs. The request_body_max_size parameter unfortunately applies to all users, so I cannot use it. So I came up with the following acl:

acl biguploads req_header Content-Length -i [[:space:]]*[[:digit:]]{6,}
http_access deny biguploads

This should block all HTTP requests with a body larger than 99,999 bytes (the regex matches any Content-Length of six or more digits). I tried it with some webmail forms and it seems to work, but it raises some questions:

1) Are there situations where large outbound HTTP transfers are done without any Content-Length header? That would make it possible for users to work around my acl.

2) What happens with HTTPS? Is it subject to the same rules as HTTP, or does it pass unfiltered, since it uses the CONNECT method? Anyway, AFAICT exploiting this would require a user to set up a secure web server outside the LAN and create a file upload form there, which is out of reach for the average user, although possible.

FTP
--------
Is Squid able to block big FTP uploads, or FTP uploads in general? I couldn't find any way to do it yet. Is there some safe way to block STOR commands? If I get confirmation that this is not possible, I will have to block all FTP traffic within Squid, and then use another tool (like ftp-proxy) to filter it, if I still want users to be able to download files from FTP servers.

SMTP
--------
This is really not on topic for the list, but nevertheless, if anyone has any suggestions... I'm currently setting up Postfix to filter SMTP connections; I just need to configure authentication-based policies.

Thanks for any suggestions you can give me.

--
Ciao, Marco.
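P.S. Since I need to exempt a few selected users, I suppose the deny rule could be combined with a proxy-authentication acl, along these lines (a sketch only: the user names are placeholders, and it assumes an auth_param scheme is already configured in squid.conf):

```
# Hypothetical squid.conf fragment: let authenticated, privileged users
# through before the size check, deny oversized uploads for everyone else.
acl privileged proxy_auth alice bob
acl biguploads req_header Content-Length -i [[:space:]]*[[:digit:]]{6,}
http_access allow privileged
http_access deny biguploads
```

Order matters here: Squid evaluates http_access lines top to bottom, so the allow for privileged users has to come before the deny.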
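P.P.S. Regarding my question 1 above: as I understand it, HTTP/1.1 permits a request body to be sent with "Transfer-Encoding: chunked" instead of a Content-Length header, in which case my acl would have nothing to match. A minimal sketch (hypothetical host and path) of what such a request looks like on the wire:

```python
def chunked_request(host: str, path: str, payload: bytes,
                    chunk_size: int = 4096) -> bytes:
    """Build a raw HTTP/1.1 POST that streams `payload` in chunks,
    with no Content-Length header at all."""
    head = (
        f"POST {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "Transfer-Encoding: chunked\r\n"
        "\r\n"
    ).encode("ascii")
    body = b""
    for i in range(0, len(payload), chunk_size):
        chunk = payload[i:i + chunk_size]
        # each chunk is prefixed by its size in hex
        body += f"{len(chunk):x}\r\n".encode("ascii") + chunk + b"\r\n"
    body += b"0\r\n\r\n"  # terminating zero-length chunk
    return head + body

# a 200 kB upload that a Content-Length-based acl would never see
req = chunked_request("upload.example.com", "/form", b"x" * 200_000)
assert b"Content-Length" not in req
```

If that is correct, a Content-Length check alone cannot be airtight, though I don't know how many real clients use chunked uploads in practice.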