
Re: Big file (500MB) need to be downloaded many times so that it can be cached?

On 4/07/2013 6:18 p.m., Makson Lee wrote:
Hi All,

The file was downloaded successfully, but it was not cached; after downloading it a few more times, it is now cached. Why?

When you first download it, the object does not exist in the cache (MISS). That first download adds it to the cache. On second and later downloads it does exist in the cache (HIT).
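If you want to confirm which case you are hitting, most Squid builds add an X-Cache header to the reply. A quick client-side check might look like this (hostname, port and file path are placeholders; adjust for however your clients actually reach the file):

  curl -sk -o /dev/null -D - https://proxyserver.domain:9443/path/to/bigfile | grep -i '^X-Cache'

A MISS on the first run followed by HIT on later runs is the normal pattern described above.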

  Could you tell me how to debug it?

Please provide the output of the "squid -v" command.

Please also provide a copy of the HTTP headers for this object as delivered by the server to Squid. And tell us exactly how big the object is.
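For the origin-side headers, fetching the object directly from the peer and dumping the response headers is usually enough. The path below is a placeholder, -k mirrors the DONT_VERIFY_PEER setting in case the peer certificate is self-signed, and you may need to add -u user:pass if the peer requires the login that cache_peer is forwarding:

  curl -sk -o /dev/null -D - https://apserver.domain:9443/path/to/bigfile

The Content-Length line in that output also gives the exact object size.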


cache_replacement_policy heap GDSF
memory_replacement_policy heap GDSF
maximum_object_size 1024 MB
maximum_object_size_in_memory 16 MB
cache_dir aufs /usr/local/squid/var/cache 307200 256 256
cache_mem 6144 MB
cache_store_log none
cache_peer apserver.domain parent 9443 0 no-query originserver name=httpsAccel ssl login=PROXYPASS sslflags=DONT_VERIFY_PEER

Option "login=PROXYPASS" should not be necessary. Only "login=PASS".


cache_peer_access httpsAccel allow all
coredump_dir /usr/local/squid/var/cache
http_port 3128
http_access allow all
https_port 9443 cert=/usr/local/squid/etc/server.pem accel key=/usr/local/squid/etc/privkey.pem vhost
refresh_pattern .              0       20%     4320
cache_mgr admin
cachemgr_passwd 123456 all

I hope that is not your real cache manager password. If so please change it *urgently*.
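A minimal example of that change, with a placeholder secret of your own choosing:

  cachemgr_passwd some-long-random-secret all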

buffered_logs on
visible_hostname proxyserver.domain

Amos



