
Re: Not all html objects are being cached

After some time reading the thread and getting to the bottom of it, I think I can offer another angle on the caching subject.
Here is an example access.log that I put together with Amos's help:
https://gist.github.com/elico/2ea2253ef1c09872ba90becb961acd91
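For reference, a custom logformat along these lines can pull the cache decision details into an access log. This is a hedged sketch, not the exact format from the gist; the %err_code/%err_detail codes assume Squid 3.2 or later:

```
# Hypothetical "cache decision" logformat (assumes Squid 3.2+ for %err_code/%err_detail)
# %Ss = Squid request status (TCP_MISS, TCP_HIT, ...), %Sh = hierarchy status
logformat cache_debug %ts.%03tu %6tr %>a %Ss/%03>Hs %<st %rm %ru %mt %Sh/%<a %err_code/%err_detail
access_log /var/log/squid3/cache_debug.log cache_debug
```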

It can show the system administrator a couple of good reasons why an object wasn't cached.
It can inform both academic and real-world decisions.
I believe that today Squid does a nice job deciding what to cache and what not to, within the limits of any LRU-based caching system or software.

I think Wireshark was a good choice for analyzing the situation.
There is a possibility that some CMS or web site does things the wrong way and thereby forces an object to be "non-cacheable", but from there to breaking the fundamentals of HTTP there is a lot to learn.
First learn (we are here to assist), and then break, if required.
There are many tools to break caching "rules", but Squid tries to play the role of the most "friendly" one on the Internet.
On the one hand it gives you a couple of very good APIs and configuration options, but it requires you, the admin, to know what you are doing. When you don't know or do not understand, take a couple of minutes to find the right document or tutorial (textual or video).
If you cannot find these, we are here to help if needed.
Just ask!
The answers will appear sooner or later in the thread.

All the best,
Eliezer

----
Eliezer Croitoru
Linux System Administrator
Mobile: +972-5-28704261
Email: eliezer@xxxxxxxxxxxx


-----Original Message-----
From: squid-users [mailto:squid-users-bounces@xxxxxxxxxxxxxxxxxxxxx] On Behalf Of boruc
Sent: Tuesday, January 24, 2017 5:53 PM
To: squid-users@xxxxxxxxxxxxxxxxxxxxx
Subject:  Not all html objects are being cached

Hi everyone,

I was wondering why some of the visited pages are not being cached (I mean
"main" pages, like www.example.com). If I visit 50 pages, only 10 will be
cached. The text below is from my log files:

store.log:
1485272001.646 RELEASE -1 FFFFFFFF 04F7FA9EAA7FE3D531A2224F4C7DDE5A  200 1485272011 -1 375007920 text/html -1/222442 GET http://www.wykop.pl/

access.log
1485272001.646    423 10.10.10.136 TCP_MISS/200 223422 GET http://www.wykop.pl/ - DIRECT/185.66.120.38 text/html

According to the Squid wiki: "if a RELEASE code was logged with file number
FFFFFFFF, the object existed only in memory, and was released from memory."
- I understand that the requested HTML file wasn't saved to disk, but why?
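One quick way to see the overall pattern is to tally the second field of store.log, which holds the action (RELEASE, SWAPOUT, ...). A rough sketch, using sample lines in place of the real /var/log/squid3/store.log:

```shell
# Tally store.log actions (field 2): how many objects were released vs swapped out.
# The sample lines below stand in for a real /var/log/squid3/store.log.
cat > /tmp/store.sample <<'EOF'
1485272001.646 RELEASE -1 FFFFFFFF 04F7FA9EAA7FE3D531A2224F4C7DDE5A 200 1485272011 -1 375007920 text/html -1/222442 GET http://www.wykop.pl/
1485272002.100 SWAPOUT 00 0000001A 11112222333344445555666677778888 200 1485272012 1485272000 375007920 text/css 5120/5120 GET http://www.wykop.pl/a.css
EOF
awk '{print $2}' /tmp/store.sample | sort | uniq -c
```

For the sample above this prints one RELEASE and one SWAPOUT; on a real log, a high RELEASE count for text/html is the pattern you are describing.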

I'm also posting my squid.conf below. I'd be grateful for your answers!


acl manager proto cache_object
acl localhost src 127.0.0.1/32 ::1
acl to_localhost dst 127.0.0.0/8 0.0.0.0/32 ::1
acl my_network src 192.168.0.0/24
acl my_phone src 192.168.54.0/24
acl my_net dst 192.168.0.0/24
acl mgr src 10.48.5.0/24
acl new_net src 10.10.10.0/24
acl ex_ft url_regex -i "/etc/squid3/excluded_filetypes.txt"
acl ex_do url_regex -i "/etc/squid3/excluded_domains.txt" # doesn't include any of the 50 visited pages

acl SSL_ports port 443
acl Safe_ports port 80		# http
acl Safe_ports port 21		# ftp
acl Safe_ports port 443		# https
acl Safe_ports port 70		# gopher
acl Safe_ports port 210		# wais
acl Safe_ports port 1025-65535	# unregistered ports
acl Safe_ports port 280		# http-mgmt
acl Safe_ports port 488		# gss-http
acl Safe_ports port 591		# filemaker
acl Safe_ports port 777		# multiling http
acl CONNECT method CONNECT

http_access allow my_network
http_access allow my_phone
http_access allow my_net
http_access allow mgr
http_access allow new_net
http_access allow manager localhost
http_access deny manager

http_access deny !Safe_ports

http_access deny CONNECT !SSL_ports

http_access allow localhost
http_access allow all

http_port 3128

maximum_object_size_in_memory 1024 KB

cache_dir ufs /var/spool/squid3 1000 16 256

cache_store_log /var/log/squid3/store.log

coredump_dir /var/spool/squid3

cache deny ex_ft
cache deny ex_do

refresh_pattern ^ftp:		1440	20%	10080
refresh_pattern ^gopher:	1440	0%	1440
refresh_pattern -i (/cgi-bin/|\?) 0	0%	0
refresh_pattern (Release|Packages(.gz)*)$      0       20%     2880

refresh_pattern .               1000       20%     4320
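For context, the catch-all `refresh_pattern . 1000 20% 4320` only applies to responses without explicit freshness information (Expires or Cache-Control override it). Simplified, Squid's heuristic is: stale if age exceeds MAX; fresh if age is under MIN; otherwise fresh while age stays below PCT% of the time between Last-Modified and retrieval. A simplified sketch of that check, with hypothetical numbers in minutes:

```shell
# Simplified sketch of the refresh_pattern heuristic (not the full algorithm;
# explicit Expires/Cache-Control headers override all of this).
age=120       # minutes since the object entered the cache (hypothetical)
lm_age=1000   # minutes between Last-Modified and retrieval (hypothetical)
min=1000; pct=20; max=4320   # the catch-all rule above

if   [ "$age" -gt "$max" ]; then echo STALE
elif [ "$age" -lt "$min" ]; then echo FRESH
elif [ $((age * 100)) -lt $((lm_age * pct)) ]; then echo FRESH
else echo STALE
fi
```

With these numbers the object is still fresh (age 120 is under MIN 1000), which illustrates why the refresh_pattern itself is rarely the reason an object was never stored at all.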

request_header_access Accept-Encoding deny all



--
View this message in context: http://squid-web-proxy-cache.1019090.n4.nabble.com/Not-all-html-objects-are-being-cached-tp4681293.html
Sent from the Squid - Users mailing list archive at Nabble.com.
_______________________________________________
squid-users mailing list
squid-users@xxxxxxxxxxxxxxxxxxxxx
http://lists.squid-cache.org/listinfo/squid-users




