On Thursday 16 June 2016 at 21:11:50, Alfredo Rezinovsky wrote:

> Well.. I tried.
> I need to ban 8613 URLs, because of a law.

Have you considered https://www.urlfilterdb.com/products/ufdbguard.html ?

> If I put one per line in a file and set the filename for a url_regex acl,
> it works. But when the traffic goes up, the CPU load goes to 100% (even
> using workers) and the proxy becomes unusable.

Er, I'm not surprised.

> I tested and saw my squid can't parse regexes longer than 8192 characters.
> I managed to combine the 8000 URIs into 34 regexes using a ruby gem, and
> the CPU load stays at almost the same level as without any acl (same
> traffic).

That must be *way* beyond anything that could be described as "maintainable".

> the regex is:

Er, thanks, that confirms my suspicions above :)

Antony.

-- 
Behind the counter a boy with a shaven head stared vacantly into space,
a dozen spikes of microsoft protruding from the socket behind his ear.
 - William Gibson, Neuromancer (1984)

Please reply to the list; please *don't* CC me.

_______________________________________________
squid-users mailing list
squid-users@xxxxxxxxxxxxxxxxxxxxx
http://lists.squid-cache.org/listinfo/squid-users
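
(For anyone curious what "combining the URLs into a few regexes" can look like in practice: below is a minimal sketch, not the Ruby gem Alfredo used. It shares common prefixes through a character trie and splits the output so each pattern stays under the ~8192-character limit mentioned above. The file names, the 8000-character cap, and the greedy chunking are illustrative assumptions, not anything Squid itself requires.)

#!/usr/bin/env python3
# Minimal sketch: collapse many literal URLs into a few compact regexes
# by sharing common prefixes via a character trie, keeping each pattern
# under the ~8192-character limit reported in the thread. Illustrative
# only; file names and the 8000-character cap are assumptions.
import re

MAX_PATTERN_LEN = 8000  # stay safely below the reported 8192-character limit


def trie_regex(words):
    """Build one regex matching any of the given literal strings."""
    trie = {}
    for w in words:
        node = trie
        for ch in w:
            node = node.setdefault(ch, {})
        node[''] = True  # end-of-word marker

    def to_pattern(node):
        # Recursion depth equals the longest URL length; fine for typical URLs.
        end = '' in node
        alts = [re.escape(ch) + to_pattern(node[ch])
                for ch in sorted(k for k in node if k != '')]
        if not alts:
            return ''
        if len(alts) == 1 and not end:
            return alts[0]
        body = '(?:' + '|'.join(alts) + ')'
        return body + '?' if end else body

    return to_pattern(trie)


def combine(urls, max_len=MAX_PATTERN_LEN):
    """Greedily split the URL list into chunks whose regex fits under max_len.

    Rebuilding the trie for every added URL is naive but adequate for a sketch.
    """
    patterns, chunk = [], []
    for url in sorted(set(urls)):
        candidate = chunk + [url]
        if chunk and len(trie_regex(candidate)) > max_len:
            patterns.append(trie_regex(chunk))
            chunk = [url]
        else:
            chunk = candidate
    if chunk:
        patterns.append(trie_regex(chunk))
    return patterns


if __name__ == '__main__':
    # 'banned_urls.txt' and the output file names are made up for the example.
    with open('banned_urls.txt') as fh:
        urls = [line.strip() for line in fh if line.strip()]
    for i, pattern in enumerate(combine(urls)):
        with open('banned_regex_%d.txt' % i, 'w') as out:
            out.write(pattern + '\n')

Each generated file holds one pattern and could be referenced from a url_regex acl in squid.conf the same way the original one-URL-per-line file was (quoted file path), just a few dozen patterns instead of 8000+. Matching a handful of prefix-sharing patterns per request is far cheaper than scanning thousands of independent regexes, which matches the CPU behaviour described in the thread.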