
Re: proxy working but not the cache + curl getting 400 error

Sender: squid3@xxxxxxxxxxxxx
Subject: Re: proxy working but not the cache + curl getting 400 error
Message-Id: <f117f7a91ec675c10db1c4076818103f.squirrel@xxxxxxxxxxxxxxxxxxxxx>
Recipient: chris.brain@xxxxxxxxxxxxx
--- Begin Message ---
> I was able to get things to work in Firefox.  Your tip about refresh
> always getting a MISS or REFRESH_HIT helped solve it.
>
> However, I am still seeing some different behavior in my curl.
>
> When I hit the page with Firefox I see that the page is being cached
> appropriately.  However, when I hit the same page with curl, it does
> not seem to get the cached version.  My curl is now returning the page
> correctly, however every time I connect it is not getting the cached
> version.
>
> I have confirmed that it is going through the proxy (my webserver will
> not accept traffic that isn't through the proxy), so I am a little
> confused.

Check the headers curl is sending to Squid, particularly the
Cache-Control or Pragma: values. no-cache, no-store, and private are
killers.
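
A quick way to see what curl is actually sending is verbose mode. A
sketch, with placeholder proxy/server names:

  curl -v -x http://proxy-host:3128/ -o /dev/null http://webserver/page.php

The lines marked '>' in the output are the request headers going to
Squid. Older curl releases are also known to add "Pragma: no-cache" to
proxy requests by default; if yours does, sending an empty header
removes it:

  curl -x http://proxy-host:3128/ -H 'Pragma:' http://webserver/page.php

Watching the "X-Cache:" header Squid puts on responses will then tell
you HIT vs MISS.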

Amos

>
> Thanks!
> -hortitude
>
> On Tue, Mar 10, 2009 at 9:04 PM, Amos Jeffries <squid3@xxxxxxxxxxxxx>
> wrote:
>> Hortitude Eyeball wrote:
>>>
>>> I am trying to set up Squid to be a simple proxy-cache.
>>> I am seeing two strange behaviors.
>>> I have 3 machines.  I am using one as my web browser, one as my
>>> proxy-cache and the third as my web server.
>>> When I configure my web browser (Firefox) to connect through my
>>> proxy-cache to my web server I see content as expected, however it
>>> does not seem to be cached.  The web page I am using is at the bottom
>>> of this post.  When I view the web page I keep seeing the time change,
>>> so I know that it is not being cached.
>>
>> NP: pressing the refresh button in a web browser sends a special
>> header to refresh the page, at minimum forcing Squid to check for an
>> updated version. Your web server, as ircache reports, sends a new
>> object when asked about modification. This will result in REFRESH_HIT
>> or MISS.
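>>
>> As a sketch of what that looks like on the wire: a plain refresh
>> usually adds
>>
>>   Cache-Control: max-age=0
>>
>> (forcing Squid to revalidate with the origin), while a forced refresh
>> (shift+reload) typically sends "Cache-Control: no-cache" and bypasses
>> the cached copy entirely.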
>>
>>> Furthermore, when I use curl to go through the proxy and look at the
>>> headers using the -D option, I see a 400 error from the proxy server
>>> and then a 200 from the web server?  I also see a message from the
>>> squid server of "Invalid Request".
>>
>> From that description I'd guess you have a domain name which resolves to
>> your Squid box AND the web server?
>>
>> 400 from squid is probably a forwarding loop?
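>>
>> One way to confirm that: cache.log will show a "WARNING: Forwarding
>> loop detected" message when it happens, since Squid spots its own
>> hostname in the Via: request header as the request comes back around.
>> If so, make sure the site's DNS name does not resolve to the proxy box.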
>>
>>
>>>
>>> When I run my web page through
>>> http://www.ircache.net/cgi-bin/cacheability.py it says:
>>>
>>> This object will be fresh for 20 hr 22 min. It has a validator
>>> present, but when a conditional request was made with it, the same
>>> object was sent anyway.
>>>
>>> Can anyone help?
>>> Thanks!
>>> I am running SQUID 2.7.STABLE3 on Ubuntu.
>>>
>>>
>>> I have not changed the config much at all.  I did a grep of all
>>> options that are set in the config file and have included them here:
>>>
>>> acl all src all
>>> acl manager proto cache_object
>>> acl localhost src 127.0.0.1/32
>>> acl to_localhost dst 127.0.0.0/8
>>> acl localnet src 10.0.0.0/8     # RFC1918 possible internal network
>>> acl localnet src 172.16.0.0/12  # RFC1918 possible internal network
>>> acl localnet src 192.168.0.0/16 # RFC1918 possible internal network
>>> acl SSL_ports port 443          # https
>>> acl SSL_ports port 563          # snews
>>> acl SSL_ports port 873          # rsync
>>> acl Safe_ports port 80          # http
>>> acl Safe_ports port 21          # ftp
>>> acl Safe_ports port 443         # https
>>> acl Safe_ports port 70          # gopher
>>> acl Safe_ports port 210         # wais
>>> acl Safe_ports port 1025-65535  # unregistered ports
>>> acl Safe_ports port 280         # http-mgmt
>>> acl Safe_ports port 488         # gss-http
>>> acl Safe_ports port 591         # filemaker
>>> acl Safe_ports port 777         # multiling http
>>> acl Safe_ports port 631         # cups
>>> acl Safe_ports port 873         # rsync
>>> acl Safe_ports port 901         # SWAT
>>> acl purge method PURGE
>>> acl CONNECT method CONNECT
>>> http_access allow manager localhost
>>> http_access deny manager
>>> http_access allow purge localhost
>>> http_access deny purge
>>> http_access deny !Safe_ports
>>> http_access deny CONNECT !SSL_ports
>>> http_access allow localhost
>>> http_access allow localnet
>>> http_access deny all
>>> icp_access allow localnet
>>> icp_access deny all
>>> http_port 3128
>>> hierarchy_stoplist cgi-bin ?
>>> access_log /var/log/squid/access.log squid
>>> refresh_pattern ^ftp:           1440    20%     10080
>>> refresh_pattern ^gopher:        1440    0%      1440
>>> refresh_pattern -i (/cgi-bin/|\?) 0     0%      0
>>> refresh_pattern .               0       20%     4320
>>> acl apache rep_header Server ^Apache
>>> broken_vary_encoding allow apache
>>> extension_methods REPORT MERGE MKACTIVITY CHECKOUT
>>> hosts_file /etc/hosts
>>> coredump_dir /var/spool/squid
>>>
>>>
>>> -------------------------------------------------------------------------------------------
>>>
>>> Here is the web page I am using
>>>
>>> <?php
>>> // the time we got hit and generated content
>>> $now = time();
>>> $generatedAt = gmdate('D, d M Y H:i:s T', $now);
>>>
>>> // the last modified date (midnight on the same day of generation, as
>>> // per your business-rule)
>>> $lastModified = gmdate('D, d M Y 00:00:00 T', $now);
>>>
>>> // date of expiry (24 hours after the last modified date, as per your
>>> // business-rule)
>>> $expiresAt = gmdate('D, d M Y H:i:s T', strtotime($lastModified) + 86400);
>>>
>>> // The minimum required HTTP headers to make Squid do what you asked
>>> // are Last-modified and Cache-control. We need to give Cache-control
>>> // the expiry time in terms of "age" (in seconds) so we calculate that
>>> // below. Optionally you could also provide the "Expires: $expiresAt"
>>> // header to tell the browser/client the same information, just in a
>>> // different way. This is not required for Squid though.
>>> $maxAge = strtotime($expiresAt) - strtotime($generatedAt);
>>> header('Last-modified: ' . $lastModified);
>>> header('Cache-control: max-age=' . $maxAge);
>>> header('Expires: ' . $expiresAt);
>>>
>>> // The rest is simply informational
>>> header('Content-type: text/plain');
>>> echo "The content of this page was last modified at $lastModified\n";
>>> echo "This page was generated at $generatedAt and will be cached by
>>> Squid for $maxAge seconds until $expiresAt\n";
>>> ?>
>>
>>
>> Amos
>> --
>> Please be using
>>  Current Stable Squid 2.7.STABLE6 or 3.0.STABLE13
>>  Current Beta Squid 3.1.0.6
>>
>



--- End Message ---
