
Re: Some things in the log


 



On 14/12/17 02:46, erdosain9 wrote:
Hi to all.
I'm seeing some things in the log.
Like this:

-Vary object loop

When Squid looks up a URL in the cache it finds that there are Vary headers to be accounted for in the storage key/ID. After making the appropriate changes to add that Vary info, Squid finds a *different* set of Vary changes to make that contradicts the initial ones.

Cache key collisions happen sometimes, and objects get replaced without updating the Vary objects referencing them. So it can happen occasionally in normal operation. But constant repetition of that message is a problem, and IIRC there are some unresolved bugs that can lead to it.
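To see why Vary matters for the storage key, here is a simplified sketch of Vary-aware cache keying. This is NOT Squid's actual implementation; the hashing scheme and helper name are made up for illustration:

```python
# Simplified sketch of Vary-aware cache keying (not Squid's real code):
# the effective storage key is derived from the URL plus the values of
# the request headers named in the response's Vary header.
import hashlib

def cache_key(url, vary_header, request_headers):
    """Build a storage key from the URL and the Vary-selected header values."""
    parts = [url]
    for name in sorted(h.strip().lower() for h in vary_header.split(",")):
        parts.append(f"{name}={request_headers.get(name, '')}")
    return hashlib.sha256("\x00".join(parts).encode()).hexdigest()

# Two requests for the same URL with different Accept-Encoding values map
# to different cache entries when the response says "Vary: Accept-Encoding".
k1 = cache_key("http://example.com/a", "Accept-Encoding",
               {"accept-encoding": "gzip"})
k2 = cache_key("http://example.com/a", "Accept-Encoding",
               {"accept-encoding": "identity"})
assert k1 != k2
```

A "Vary object loop" corresponds to the lookup and the stored metadata disagreeing about which headers should have gone into that key.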

-Could not parse headers from on disk object

Something has mangled the HTTP message headers in the on-disk copy of a cached object.

IIRC this can happen if you are using the rock cache_dir type and the message headers exceed a single database cell/slot size.

But for UFS/AUFS/diskd caches it tends to happen only due to HDD failure or some other process fiddling with the disk storage.
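If you do use a rock cache_dir, the slot size can be tuned in squid.conf. The path and sizes below are placeholders, not recommendations:

```
# Example only - adjust the path and sizes for your system.
# slot-size sets the database cell size; responses whose headers do not
# fit within the slots allocated to them can trigger header-parse errors.
cache_dir rock /var/cache/squid/rock 1024 slot-size=32768
```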


-varyEvaluateMatch: Oops


In all of the above cases Squid handles the problem by ignoring the cache contents and fetching fresh data from online. That replaces the broken cache contents, so the particular instance of the problem disappears immediately.


ipcacheParse No Address records in response to (I suppose this is not a
problem)

It is a minor problem. A domain name required to service some transaction is not correctly set up in DNS. There is likely nothing you can do about it. Squid reports this type of thing so you can see what happened if any clients complain, and/or report the problem to someone who may be able to fix it properly.


FYI: A quick check of the domains your log snippet mentions shows they mostly have CNAME references pointing at Cloudflare server aliases that do not exist, or similar for other CDN services. So Squid is quite correct - there are zero IP addresses for those domain names.

The DNS is incorrect because a CNAME should not point at a non-existent server name. The domain owner needs to fix that so services like Squid get a proper NXDOMAIN result instead of a "success" CNAME with zero IPs.
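The "dangling CNAME" situation can be illustrated with a toy resolver over an in-memory zone. This is not a real DNS client; the zone data and names below are invented:

```python
# Toy illustration of a dangling CNAME (not a real DNS client; the zone
# and names are made up). Following the chain ends at a name with no A
# records, so resolution "succeeds" yet yields zero addresses.
ZONE = {
    "cdn.example.org": ("CNAME", "edge.cdn-provider.example"),
    # "edge.cdn-provider.example" has no record at all: a dangling target.
    "www.example.org": ("A", "192.0.2.10"),
}

def resolve(name, max_chain=8):
    """Follow CNAMEs in ZONE; return the list of IPs (possibly empty)."""
    for _ in range(max_chain):
        record = ZONE.get(name)
        if record is None:
            return []           # chain dangles: zero IPs despite the CNAME
        rtype, value = record
        if rtype == "A":
            return [value]
        name = value            # follow the CNAME
    return []

assert resolve("www.example.org") == ["192.0.2.10"]
assert resolve("cdn.example.org") == []   # the situation Squid reports
```

A proper NXDOMAIN for the CNAME itself would let Squid fail the lookup cleanly instead of reporting a no-address "success".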

Also, why a client is trying to fetch URLs from a non-existent server is probably worth looking into if you want to investigate further.


2017/12/12 16:09:54 kid1| Error negotiating SSL on FD 701:
error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify
failed (1/-1/0)

Exactly what it says. The TLS/SSL certificate presented by a server could not be validated. It could be *actually* invalid (that happens rather a lot), or just missing some issuer cert - an intermediate CA cert from the handshake, or a root CA from your system's trusted CA set.

Seeing how often these cert messages are occurring: if you have a Squid older than the latest Squid-4 beta, you might want to try that release out and see if these disappear or at least reduce significantly. It can auto-download those missing intermediate certs instead of erroring out.
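On older Squid versions, one possible workaround for missing-intermediate failures is to supply the intermediates yourself via the sslproxy_foreign_intermediate_certs directive (available since Squid 3.5). The file path below is only an example:

```
# Example only - the path is a placeholder. Certificates in this file
# are used to complete server cert chains when the server omits its
# intermediate CA certs from the TLS handshake.
sslproxy_foreign_intermediate_certs /etc/squid/intermediate_certs.pem
```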

NP: that may also reduce the loudness of those no-IPs messages.



Amos
_______________________________________________
squid-users mailing list
squid-users@xxxxxxxxxxxxxxxxxxxxx
http://lists.squid-cache.org/listinfo/squid-users



