Bad URL sites

I use a file called bad_url.squid to hold the sites I want blocked.  I
think I have hit a limit on what it can handle: when I do a reconfigure,
it can take a few minutes for the data to be scanned, and processing
power gets eaten up.  I know there is DansGuardian and a few other ways
to manage a blocklist, but can squid itself handle a list like that?  I
have about 19,000 sites to block.
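
In case it matters, the file is hooked in the usual dstdomain way,
something along these lines (the ACL name and path here are just
illustrative, not necessarily my exact config):

    # Read one domain per line from the file; a leading dot
    # (e.g. .example.com) also matches all subdomains.
    acl bad_urls dstdomain "/etc/squid/bad_url.squid"
    http_access deny bad_urls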

Thanks


