
RE: cache.log clientReadRequest no data to process resource temporarily unavailable errors


-----Original Message-----
From: Henrik Nordstrom [mailto:henrik@xxxxxxxxxxxxxxxxxxx]
Sent: Friday, July 20, 2007 3:43 PM
To: Scott B. Anderson
Cc: squid-users@xxxxxxxxxxxxxxx
Subject: Re:  cache.log clientReadRequest no data to process resource temporarily unavailable errors

On fre, 2007-07-20 at 12:22 -0500, Scott B. Anderson wrote:
> I've done some google searching on this already, but I don't see an answer, just questions about it. I'm trying to track down an issue where all cache store writes appear to work fine (the cache_dir is working as far as I can tell) but subsequent attempts to read the files out of cache fail, all of them, greatly increasing the time pages take to load, and in some cases pages fail to load at all. When they do fail to finish loading (they usually start, but fail in the middle), hitting refresh in the browser sometimes finishes loading the page. This problem started for me when I restarted the server for the first time in about a year, a few weeks ago.
>
> This is what is in my cache.log:
> (debug ALL,2)
> 2007/07/20 10:53:21| The reply for GET http://www.google.com/intl/en_uk/images/t4_ie.gif is ALLOWED, because it matched 'all'
> 2007/07/20 10:53:21| clientReadRequest: FD 67: no data to process
> ((11) Resource temporarily unavailable)

This is completely normal, and only seen because you have enabled debugging: errno 11 is EAGAIN, which a non-blocking read returns whenever the client socket simply has no data waiting yet.

The problem you are seeing is somewhere else.

Any errors in cache.log without extra debugging enabled? (i.e. just "debug_options ALL,1")

What does access.log say? (preferably with log_mime_hdrs on)
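
For reference, both together in squid.conf would look something like this (a minimal sketch; ALL,1 is the debug default):

  # keep debugging at the default level so only real errors reach cache.log
  debug_options ALL,1
  # log full request/reply headers in access.log while troubleshooting
  log_mime_hdrs on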

And what does the cacheability check engine say about the same URL?
http://www.ircache.net/cgi-bin/cacheability.py

Regards
Henrik


Thanks for your reply.  It just looked odd that nearly every request I could see had one of those clientReadRequest lines.

Checking cache.log I see no errors, and store.log is constantly RELEASEing the same objects, like google-analytics.com/urchin.js, sometimes more than once per second. I'm going to guess that Google Desktop Search or the Google toolbar generates that traffic and that the responses tell caches never to store them.
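
A quick way to rank those RELEASE entries by URL, assuming the stock store.log layout (action in the second field, URL in the last) and substituting your own log path:

  # count RELEASE entries per URL, most frequent first
  awk '$2 == "RELEASE" { print $NF }' /var/log/squid/store.log | sort | uniq -c | sort -rn | head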

As for the access.log, I just did a request to http://news.com.com:

1184958348.121     96 10.0.0.20 TCP_MISS/200 1008 GET http://i.i.com.com/cnwk.1d/i/se/css/bwp/news.css - DIRECT/204.2.225.33 text/css
1184958348.265    143 10.0.0.20 TCP_MISS/200 21241 GET http://news.com.com/html/js/redball/s_code.js - DIRECT/216.239.122.200 application/x-javascript
1184958348.304    723 10.0.0.20 TCP_MISS/200 418326 GET http://i.i.com.com/cnwk.1d/i/ne/pg/fd_2007/071907thx.png - DIRECT/204.2.225.43 image/png
1184958348.304    455 10.0.0.20 TCP_MISS/200 12516 GET http://i.i.com.com/cnwk.1d/i/ne/pg/fd_2007/070307radar_guitar_88x66.png - DIRECT/204.2.225.33 image/png
1184958348.351     47 10.0.0.20 TCP_MISS/200 438 GET http://dw.com.com/clear/c.gif? - DIRECT/216.239.122.227 image/gif
1184958348.395    396 10.0.0.20 TCP_MISS/200 20469 GET http://i.i.com.com/cnwk.1d/Ads/631/12/excel_122x90_0418.jpg - DIRECT/204.2.225.43 image/jpeg
1184958348.487    181 10.0.0.20 TCP_MISS/302 956 GET http://pix01.revsci.net/K05540/a3/0/3/420/1/0/113E5057A87/0/0/00000000/287578608.gif? - DIRECT/38.96.134.245 text/plain
1184958348.555    203 10.0.0.20 TCP_MISS/200 394 GET http://wzus.ask.com/i/i.gif? - DIRECT/65.214.37.120 image/gif
1184958348.557    292 10.0.0.20 TCP_MISS/200 31640 GET http://spe.atdmt.com/ds/4IASKASKSATE/300x250_tr_smartanswers_0507_30k.swf? - DIRECT/213.200.107.146 application/x-shockwave-flash
1184958350.104   1602 10.0.0.20 TCP_MISS/200 1946 GET http://js.revsci.net/ins/K05540/a3/0/3/420/1/0/113E5057A87/0/0/00000000/287578608.gif? - DIRECT/209.249.142.97 image/gif


A subsequent refresh in the browser also showed a TCP_MISS:

1184967009.696    702 10.0.0.20 TCP_MISS/200 94248 GET http://news.com.com/ - DIRECT/216.239.122.200 text/html
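
To put a number on it, the result code is the first half of the fourth field in the default access.log format, so a rough hit/miss tally looks like this (log path assumed, adjust for your install):

  # tally Squid result codes (TCP_MISS, TCP_HIT, ...) across access.log
  awk '{ split($4, a, "/"); print a[1] }' /var/log/squid/access.log | sort | uniq -c | sort -rn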



Ircache.net reports the following for http://news.com.com:

http://news.com.com
Expires          now (Fri, 20 Jul 2007 21:31:04 GMT)
Cache-Control    -
Last-Modified    -
ETag             -
Content-Length   - (actual size: 94197)
Server           Apache/2.0

This object has been deliberately marked stale. It doesn't have a validator present. It doesn't have a Content-Length header present, so it can't be used in an HTTP/1.0 persistent connection.

It is a very long list of images and objects, including a fair amount of .js javascript. Looks like it's purposely setting Expires to force caches not to cache it?
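
In other words, the response headers presumably look something like this (illustrative only, reconstructed from the report above):

  HTTP/1.1 200 OK
  Date: Fri, 20 Jul 2007 21:31:04 GMT
  Expires: Fri, 20 Jul 2007 21:31:04 GMT
  Server: Apache/2.0
  Content-Type: text/html

With Expires equal to Date the object is stale on arrival, and with no Last-Modified or ETag there is nothing to revalidate against, so a cache has to refetch the whole body every time.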

If I try it with the proxy turned on I get this - so I'm guessing that doesn't help matters at all.


"There has been a unrecoverable error.
It probably wasn't your fault, but the following may give you an idea of what was happening when things went bad. Please mail this page to errors@xxxxxxxx

(version 1.30)

Traceback (innermost last):
  File "/httpd/cgi-bin/cacheability.py", line 902, in cgi_wrap
    cgi_main()
  File "/httpd/cgi-bin/cacheability.py", line 847, in cgi_main
    f.fetch()
  File "/usr/local/lib/python1.5/site-packages/spider.py", line 262, in fetch
    self.parse_html()
  File "/usr/local/lib/python1.5/site-packages/spider.py", line 311, in parse_html
    tmp.fetch()
  File "/usr/local/lib/python1.5/site-packages/spider.py", line 200, in fetch
    self._hdrs = Headers(i)
  File "/usr/local/lib/python1.5/site-packages/httpheaders.py", line 80, in __init__
    self.data[header_name].append(
AttributeError: objtype"
