
RE: Memory Error when using large acl files


 



Tue, 2006-02-28 at 15:54 +0000, Paul Mattingly wrote:

> Why does squid's memory usage increase by nearly 320MB when the file is only 9MB?

I would guess it is because you are using a regex ACL; each line gets
compiled into a regex internally to speed up the processing.
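To illustrate (the file name and ACL name are just examples), a regex
blacklist in squid.conf typically looks like this:

    acl badurls url_regex -i "/etc/squid/blocked_regex.txt"
    http_access deny badurls

Each line of that file is compiled into its own regex when Squid starts,
which is where the memory goes.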

A dstdomain ACL of 600K entries (9MB) uses 37MB of memory on 64-bit
platforms or 23MB on 32-bit platforms in my tests.

Startup time for parsing this dstdomain ACL was about 15-20 seconds.
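For comparison (file name again just an example), the equivalent
dstdomain setup reads the whole list from a file at startup:

    acl blockeddomains dstdomain "/etc/squid/blocked_domains.txt"
    http_access deny blockeddomains

Squid parses the file once at startup (or on squid -k reconfigure) and
keeps the domains in an internal search tree, which is why it stays both
fast and fairly small in memory.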


> Which of the redirectors/plug-ins are best for managing large blacklists if this way just won't work on this scale?

The Squid dstdomain ACL is about the fastest you can find at the moment.

The SquidGuard url ACL is the most flexible for more detailed matches
beyond just the hostname, but the overhead of using a redirector is very
significant.
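SquidGuard is hooked in through the redirector interface, roughly like
this in squid.conf (paths are examples):

    redirect_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf
    redirect_children 5

Every request is then passed over a pipe to one of the squidGuard
children and back, which is where the extra overhead comes from.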

The regex type ACLs are bad performers in both. Not much can be done
about that, as regexes have no structure to index.

SquidGuard has one nice feature in that it can use db files to avoid
building the complete index in memory on startup. And since SquidGuard
runs as a redirector, this also saves a considerable amount of memory
compared to each copy of SquidGuard building its own in-memory index.
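A minimal sketch of such a squidGuard.conf, with the list locations
purely as examples:

    dbhome /usr/local/squidGuard/db
    dest blacklist {
        domainlist blacklist/domains
        urllist    blacklist/urls
    }
    acl {
        default {
            pass !blacklist all
            redirect http://localhost/blocked.html
        }
    }

Running squidGuard -C all then compiles the plain text lists into .db
files, so each child process opens the db files instead of building its
own index from the text lists.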

Regards
Henrik

Attachment: signature.asc
Description: This is a digitally signed message part

