squid won't return cached even with refresh_pattern extra options override-lastmod override-expire ignore-reload ignore-no-store ignore-private store-stale

Hi,

Probably I'm missing something silly, or it simply can't be done, but squid won't return the cached version even when I turn all the override options ON in refresh_pattern. It's an API where we send many identical requests, and since we know that, we would like to stop those calls from going out to the origin once the same request has already been made.
With debug enabled I can see the rule is matched and the cached object is fresh, but access.log still shows TCP_REFRESH_MODIFIED.
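
The 11,3 / 22,3 lines in the cache log below come from per-section debugging. Something along these lines in squid.conf should produce that output (the exact sections/levels here are just an illustration):

debug_options ALL,1 11,3 22,3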

squid conf:
refresh_pattern -i <URL> 4320 80% 129600 override-lastmod override-expire ignore-reload ignore-no-store ignore-private store-stale
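
The real URL/regex is redacted; as a rough sketch with a made-up API host it looks like the line below. The min/max values are in minutes, so 4320 = 3 days and 129600 = 90 days, which matches the 259200/7776000 seconds shown in the refresh.cc trace further down:

# hypothetical placeholder regex, the real one is redacted
refresh_pattern -i ^https://api\.example\.com/ 4320 80% 129600 override-lastmod override-expire ignore-reload ignore-no-store ignore-private store-stale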

curl headers:
curl --insecure --verbose --request GET --url 'URL' >/dev/null
* TCP_NODELAY set
* ALPN, offering h2
* ALPN, offering http/1.1
* successfully set certificate verify locations:
*   CAfile: /etc/ssl/certs/ca-certificates.crt
 CApath: /etc/ssl/certs
} [5 bytes data]
* TLSv1.3 (OUT), TLS handshake, Client hello (1):
} [512 bytes data]
* TLSv1.3 (IN), TLS handshake, Server hello (2):
{ [122 bytes data]
* TLSv1.3 (IN), TLS handshake, Encrypted Extensions (8):
{ [6 bytes data]
* TLSv1.3 (IN), TLS handshake, Certificate (11):
{ [1956 bytes data]
* TLSv1.3 (IN), TLS handshake, CERT verify (15):
{ [78 bytes data]
* TLSv1.3 (IN), TLS handshake, Finished (20):
{ [52 bytes data]
* TLSv1.3 (OUT), TLS change cipher, Change cipher spec (1):
} [1 bytes data]
* TLSv1.3 (OUT), TLS handshake, Finished (20):
} [52 bytes data]
* SSL connection using TLSv1.3 / TLS_AES_256_GCM_SHA384

> GET URL HTTP/1.1
> Host: URL
> User-Agent: curl/7.68.0
> Accept: */*
>
{ [5 bytes data]
* TLSv1.3 (IN), TLS handshake, Newsession Ticket (4):
{ [217 bytes data]
* TLSv1.3 (IN), TLS handshake, Newsession Ticket (4):
{ [217 bytes data]
* old SSL session ID is stale, removing
{ [5 bytes data]
* Mark bundle as not supporting multiuse
< HTTP/1.1 200 OK
< Cache-Control: no-cache
< Content-Type: application/json
< X-Cloud-Trace-Context: d3c27833b8b4312ce31a2dbae7e12fd0
< Date: Wed, 24 Mar 2021 15:04:34 GMT
< Server: Google Frontend
< Content-Length: 7950
< X-Cache: MISS from server
< X-Cache-Lookup: HIT from server
< Via: 1.1 server (squid/4.14)
< Connection: keep-alive
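
One simple way to check whether a hit ever happens is to repeat the request and watch the X-Cache/Via response headers (traffic is intercepted here, so no explicit -x proxy option should be needed; 'URL' is the same placeholder as above):

for i in 1 2 3; do
  curl --insecure --silent --dump-header - --output /dev/null 'URL' | grep -i -E 'x-cache|via:'
done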

access log:
243 172.16.230.249 TCP_REFRESH_MODIFIED/200 8328 GET URL - ORIGINAL_DST/IP application/json
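
To see how often this happens overall, the result codes can be counted straight from access.log. With the default native logformat the code/status is the 4th field (in the line as pasted above it is the 3rd, so adjust the field number to the actual format):

awk '{print $4}' /var/log/squid/access.log | sort | uniq -c | sort -rn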

cache log:
2021-03-24T15:04:34 squid.710 kid1| 11,3| http.cc(982) haveParsedReplyHeaders: decided: cache positively and share because refresh check returned cacheable; HTTP status 200 e:=p2V/0x34868914670*3
2021-03-24T15:04:34 squid.710 kid1| 22,3| refresh.cc(470) refreshCheck: returning FRESH_MIN_RULE
2021-03-24T15:04:34 squid.710 kid1| 22,3| refresh.cc(455) refreshCheck: Object isn't stale..
2021-03-24T15:04:34 squid.710 kid1| 22,3| refresh.cc(327) refreshCheck: Staleness = -1
2021-03-24T15:04:34 squid.710 kid1| 22,3| refresh.cc(199) refreshStaleness: FRESH: age (60 sec) is less than configured minimum (259200 sec)
2021-03-24T15:04:34 squid.710 kid1| 22,3| refresh.cc(166) refreshStaleness: No explicit expiry given, using heuristics to determine freshness
2021-03-24T15:04:34 squid.710 kid1| 22,3| refresh.cc(307) refreshCheck: entry->timestamp: Wed, 24 Mar 2021 15:04:34 GMT
2021-03-24T15:04:34 squid.710 kid1| 22,3| refresh.cc(305) refreshCheck: check_time: Wed, 24 Mar 2021 15:05:34 GMT
2021-03-24T15:04:34 squid.710 kid1| 22,3| refresh.cc(303) refreshCheck: age: 60
2021-03-24T15:04:34 squid.710 kid1| 22,3| refresh.cc(301) refreshCheck: Matched 'URL 259200 80%% 7776000'
2021-03-24T15:04:34 squid.710 kid1| 22,3| refresh.cc(279) refreshCheck: checking freshness of URI: https://URL
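
For completeness, the lines above can be pulled out of a busy cache.log with a simple filter such as:

grep -E 'refreshCheck|refreshStaleness' /var/log/squid/cache.log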
