
Re: different results every time


Hi Eliezer,

Thanks for your prompt reply. We are testing our Squid configuration before putting it into use. To answer your question, all objects are 1 MB in size, and to test Squid we query a sequence of files multiple times, so that, in theory, at the end of the querying process we should get the same number of hits from cache1, cache2, cache3, and cache4.

The structure of the test network is: user (running a script) -> cache1 -> cache2 -> cache3 -> cache4 -> web server (which stores the queried files).
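
For illustration, a minimal squid.conf sketch of how one hop in such a chain can be pointed at its parent (the hostname and port here are placeholders, not our real values):

# on cache1: send all misses to cache2 and never go direct
cache_peer cache2.example.com parent 3128 0 no-query default
never_direct allow all

Each cache carries an equivalent pair of lines pointing at the next hop; cache4 talks to the web server directly.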

I'm going to try the Collapsed Forwarding feature (sketched below) and will post back on whether it fixes the issue.
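
For reference, a minimal squid.conf sketch of what I understand enabling it looks like (assuming a Squid version where the directive is available, e.g. 3.5 or later):

# merge concurrent requests for the same uncached object into one upstream fetch
collapsed_forwarding on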

Thank you,
Sam

On Tue, Aug 2, 2016 at 2:03 AM, Eliezer Croitoru <eliezer@xxxxxxxxxxxx> wrote:

Hey Sam,

 

From what I understand, it seems that your expectations don't match reality, but I am not sure yet.

It seems that the goal is to fetch everything, save it to disk, and then serve it from disk, right?

 

While your clients and the proxy are still fetching an object through Squid, this "in-transit" content cannot be served to other clients.

Squid's Collapsed Forwarding feature should satisfy your use case, but depending on the size of the objects it might not be the right choice for you.

What object sizes are we talking about?

 

Eliezer

 

----

Eliezer Croitoru
Linux System Administrator
Mobile: +972-5-28704261
Email: eliezer@xxxxxxxxxxxx

 

From: squid-users [mailto:squid-users-bounces@xxxxxxxxxxxxxxxxxxxxx] On Behalf Of Sam M
Sent: Tuesday, August 2, 2016 8:43 AM
To: squid-users@xxxxxxxxxxxxxxxxxxxxx
Subject: [squid-users] different results every time

 

Hi,

I'm querying lots of files through 4 cache servers connected in a parent hierarchy. I clean all the caches before I start and then query the files again in the exact same order. Oddly, every time I check the logs I see a different cache serving a given file compared with the previous test. The queries are issued by a Python script that runs wget through a proxy to the cache, so the query process is very fast.

Interestingly, if I put a delay of 1 second between queries, the results are stable and identical every time I run the script.

Below is a snippet from the config file after changing it many times in an attempt to make it reproduce the same results; none of that helped:
cache_dir ufs /var/spool/squid 9 16 256
cache_mem 0 MB
memory_pools off
cache_swap_low 100
cache_swap_high 100
maximum_object_size_in_memory 0 KB
cache_replacement_policy lru
range_offset_limit 0
quick_abort_min 0 KB
quick_abort_max 0 KB

 

Can someone shed some light on the issue and how to fix it please?

Thanks,

Sam


_______________________________________________
squid-users mailing list
squid-users@xxxxxxxxxxxxxxxxxxxxx
http://lists.squid-cache.org/listinfo/squid-users
