
Re: How big should be cache_dir ?

Hi,

Mark Elsen wrote:
Hello all,

I am planning to use Squid as an HTTP accelerator to benefit from its
great caching capabilities. It will be set up on a Windows 2003 server.

Ideally, I would like to use a huge amount of disk space for caching
(the reason is that I need to cache a very large amount of data that
takes a long time to create and almost never changes). Something in
the hundreds of gigabytes.

....

  The cache size should be roughly the amount of traffic your users
generate in a week. Oversized cache dirs will impose large memory
requirements (see the FAQ).
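
(For reference, I would declare such a cache with a cache_dir line
along these lines -- the path and the 100 GB figure are just examples
for a Windows install, not tested values:

    # type  path                  size(MB)  L1  L2
    cache_dir ufs c:/squid/var/cache 102400 16 256

As I understand the FAQ, the memory cost comes from the in-memory
index, which grows with the number of objects stored in that
directory.)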

I understand that. But I noticed that 100 000 entries in the cache take around 15 MB of memory, under quite a heavy load (around 40 HTTP requests per second), which I will never have in the real environment.

I don't think 15 MB of RAM is excessive... But then again, if I have 10 times the number of entries, will I need 10 times the RAM?
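
To put rough numbers on that question, assuming the per-object cost
stays near what I observed (15 MB / 100 000 entries, about 150 bytes
per entry):

      100 000 objects  ->  about 15 MB of index
    1 000 000 objects  ->  about 150 MB of index
   10 000 000 objects  ->  about 1.5 GB of index, plus cache_mem and
                           other overhead

If that linear scaling holds, hundreds of gigabytes of small documents
could mean several gigabytes of RAM just for the index.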

My ultimate goal would be to keep the documents forever, or almost, without going over a disk space limit, getting rid of the least recently used documents first...

I could add a temporary store function to the script that creates the documents, but since Squid and its cache can do the same thing...
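
What I have in mind is roughly this squid.conf sketch -- the values
are placeholders I have not tested, but the idea is a hard disk limit
with least-recently-used eviction and a very long freshness window:

    # Squid evicts objects as the cache_dir approaches its configured
    # size (see cache_swap_low / cache_swap_high)
    # evict least-recently-used objects first (Squid's default policy)
    cache_replacement_policy lru
    # treat objects as fresh for up to a year (525600 minutes) so they
    # are dropped for lack of space rather than because they expire;
    # responses carrying their own Expires/Cache-Control headers still
    # take precedence
    refresh_pattern . 525600 100% 525600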

Thanks for your answer,

Martin Sévigny
