RE: [squid-users] Squid slows down when a file with more than 25000 URLs to block is loaded

> -----Original Message-----
> From: Pablo Romero [mailto:psromerozu@xxxxxxxxxxx]
> Sent: Friday, January 28, 2005 2:20 PM
> To: squid-users@xxxxxxxxxxxxxxx
> Subject: [squid-users] Squid slows down when a file with more than 25000
> URLs to block is loaded
> 
> 
> Hello
> 
> I am running Squid 2.5 Stable 6 on a Pentium 4 1.8 GHz with 256 MB of RAM.
> I am trying to put this proxy into a production environment soon. Although
> Squid is performing all its tasks just fine, when I began running the ACL
> tests it seemed to slow down (very much) whenever a blacklist file was
> loaded. The configuration is the following:
> 
> acl denegar url_regex  "/opt/squid/blacklist"
> 
> This blacklist file has 25000 sites in it, and it seems to take Squid down.
> I don't know if you can provide me some tips so I can tune up my Squid
> proxy. Can you tell me if I am using the wrong hardware configuration, if
> I need more RAM, or if I just have to change some settings in the
> squid.conf file?
> 
> I'd appreciate your help.
> 
> Regards,
> 
> Pablo Romero

url_regex is a last resort, and it is very CPU-intensive: every request URL
is matched against every regex in the list. Use dstdomain instead wherever
you can (i.e. instead of "url_regex site\.domain" use "dstdomain
.site.domain") and you'll find performance improves greatly, since dstdomain
lookups use an efficient domain tree rather than a linear regex scan.
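
As a concrete sketch of that change (the filename "blacklist.domains" here
is just an example; point it at whatever file you convert your list into):

  # CPU-intensive: regex matched against every request URL
  # acl denegar url_regex "/opt/squid/blacklist"

  # Efficient: domain-tree lookup
  acl denegar dstdomain "/opt/squid/blacklist.domains"
  http_access deny denegar

Each line of the blacklist file should then be a bare domain in dstdomain
form, e.g.:

  .example.com
  .badsite.net

The leading dot makes the entry match the domain itself and all of its
subdomains.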

Chris
