Currently I am enforcing the daily download limit per IP by evaluating Squid's access.log in a script run by cron. The drawback of this approach is the delay: a download only shows up in the log, and is therefore only counted, after it has completed. The delay can be reduced simply by checking the access.log more often, and with max_reply_body_size the possible overshoot is now within an acceptable range.

However, it seems some smart guys have started running several downloads in parallel. Is there any chance to catch these ones? Limiting the number of connections per IP/user is not an option, because that would harm normal browsing.

I am already using delay_pools to throttle the downloads, which means these parallel downloads last quite some time. So something like identifying several long-lasting connections from the same client would be a possibility, I guess, but how can this be done with Squid?
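
For reference, a minimal sketch of the kind of cron job described above, in Python. It assumes the native access.log format (timestamp, elapsed, client IP, code/status, bytes, ...); the log path and the limit are made-up example values, not part of my actual setup:

#!/usr/bin/env python3
"""Sketch of a cron job: sum the bytes Squid has delivered to each client IP
today and report the IPs that exceed a daily volume limit."""

import time
from collections import defaultdict

ACCESS_LOG = "/var/log/squid/access.log"   # assumed default log location
DAILY_LIMIT = 2 * 1024 ** 3                # example value: 2 GiB per IP

# midnight of the current day, as a Unix timestamp
midnight = time.mktime(time.localtime()[:3] + (0, 0, 0, 0, 0, -1))
bytes_per_ip = defaultdict(int)

with open(ACCESS_LOG) as log:
    for line in log:
        fields = line.split()
        if len(fields) < 5:
            continue
        # native log format: timestamp elapsed client code/status bytes ...
        if float(fields[0]) < midnight:
            continue                       # only count today's traffic
        bytes_per_ip[fields[2]] += int(fields[4])

for ip, total in sorted(bytes_per_ip.items(), key=lambda kv: -kv[1]):
    if total > DAILY_LIMIT:
        # here one could e.g. append the IP to an ACL file referenced by
        # squid.conf and run "squid -k reconfigure" (left out of the sketch)
        print(f"{ip} exceeded the daily limit: {total} bytes")

Note that this only sees completed requests, which is exactly the limitation described above: downloads still in progress, including the parallel ones, do not appear in access.log yet.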