Amos Jeffries wrote:
Matthew Morgan wrote:
Amos Jeffries wrote:
Michael Bowe wrote:
-----Original Message-----
From: Matthew Morgan [mailto:atcs.matthew@xxxxxxxxx]
Sent: Saturday, 14 November 2009 7:59 AM
To: Squid Users
Subject: Re: Re: ubuntu apt-get update 404
Apparently I only get the dropped .bz2 extensions when using squid transparently, which is how our network is set up. If I manually specify http_proxy on my workstation to point to squid directly, I don't have any problems with apt-get update. Has anyone ever heard of this?
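For reference, explicitly pointing apt at the proxy looks roughly like this (the proxy host and port below are placeholders, not my real setup):

  # one-off, from the shell:
  export http_proxy=http://proxy.example.lan:3128/
  apt-get update

  # or persistently, via a line like this in /etc/apt/apt.conf.d/01proxy:
  Acquire::http::Proxy "http://proxy.example.lan:3128/";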
Here's my updated squid config (this is 3.0-STABLE20, btw).
I've been having perhaps-related problems with Debian servers behind Squid 3.1.0.14 TPROXY.
I am not getting 404s, but I am intermittently seeing "invalid reply header" errors, e.g.:
Failed to fetch http://backports.org/debian/dists/etch-backports/main/binary-amd64/Packages.gz  The HTTP server sent an invalid reply header
Err http://security.debian.org lenny/updates Release.gpg
  The HTTP server sent an invalid reply header [IP: 150.203.164.38 80]
W: Failed to fetch http://security.debian.org/dists/lenny/updates/Release.gpg  The HTTP server sent an invalid reply header [IP: 150.203.164.38 80]
As you say, if I specify HTTP_PROXY= to go directly to the cache rather than transparently, then everything works fine.
Michael.
I wonder: is that actually 3.1.0.14 direct to the origin, or perhaps going through some older sub-cache?
Are the two of you able to provide me with "tcpdump -s0" traces of the data between apt and Squid, please? Particularly for the transparent mode problems.
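Something like this, run on the Squid box, should capture what I need (the interface names, the 192.168.0.10 workstation address and the file names are only examples; adjust them for your network):

  # LAN-facing interface: traffic between apt on the workstation and Squid
  tcpdump -s0 -i eth0 -w client.dump 'host 192.168.0.10 and port 80'
  # WAN-facing interface: traffic between Squid and the origin servers
  tcpdump -s0 -i eth1 -w server.dump 'port 80'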
Amos
Ok, it seems to happen in stages. The first time I run apt-get update after switching to 3.x, it's hit or miss: sometimes it's perfect, sometimes I get errors. After that, the errors come in stages, in one of two patterns. Here's what happens:
Either:
apt-get update #1 - no errors
apt-get update #2 - invalid header, and sometimes 404 errors
apt-get update #3 and above - 404 errors only
or:
apt-get update #1 - invalid header, and sometimes 404 errors
apt-get update #2 and above - 404 errors only
The dump files I have uploaded match the second pattern. server1.dump and client1.dump are from the first apt-get update after switching, where I got an invalid header error plus 404 errors. server2.dump and client2.dump came from the second apt-get update attempt, where only 404 errors were returned.
I hope this helps! Let me know if you need anything else. Just a reminder: on my setup I only have one Squid server with one cache directory. For reference, my server is Ubuntu 9.04 running kernel 2.6.28-16-server. I am not using TPROXY.
Here are the files (I tried to attach them, but the mailer-daemon bounced the email):
http://lithagen.dyndns.org/server1.dump
http://lithagen.dyndns.org/client1.dump
http://lithagen.dyndns.org/server2.dump
http://lithagen.dyndns.org/client2.dump
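If it helps, the HTTP conversations in them can be read back with something like this (the read filter is just a guess at what's relevant):

  # print the captured packets with their ASCII payload, HTTP headers included
  tcpdump -r client1.dump -A 'tcp port 80' | less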
Well, good news and sad news.
Both traces show the same problems.
The 404 is actually being generated by the us.archive.ubuntu.com server itself. There is something broken at the mirror or in apt's local sources.list URLs.
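You can see this for yourself by fetching one of the failing URLs both directly and through Squid and comparing the status lines (the path and proxy address below are only examples; substitute one of the URLs apt reports):

  # straight to the origin, bypassing the proxy
  curl -sI http://us.archive.ubuntu.com/ubuntu/dists/jaunty/Release
  # the same request forced through Squid
  http_proxy=http://127.0.0.1:3128/ curl -sI http://us.archive.ubuntu.com/ubuntu/dists/jaunty/Release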
So does squid 3.x have a different user agent string or something? Everything works fine with the exact same sources.list when using squid 2.7, so there shouldn't be anything wrong with the file. us.archive.ubuntu.com must be treating squid 3.x differently somehow, right?
Squid-3.0 still has the deprecated default of caching 404/5xx results for 5 minutes. You may get fewer of those and other temporary errors by adding this to your squid.conf:
negative_ttl 0 seconds
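Roughly, applying it looks like this, assuming a standard install with the binary and config on the default paths:

  # check the directive is accepted, then reload the running Squid
  squid -k parse
  squid -k reconfigure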
The invalid header problem appears to be a minor issue (there should be no bad effect from it) caused by Squid sending a Proxy-Connection: header back to apt. That is meant to be Connection: on intercepted requests.
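To illustrate (the status line and header value here are made up; only the header name matters), apt was receiving replies along the lines of:

  HTTP/1.0 200 OK
  Proxy-Connection: keep-alive

where an intercepted, non-proxy-aware client should instead be sent:

  HTTP/1.0 200 OK
  Connection: keep-alive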
Now fixed for the next release. Thank you.
Amos
Glad to have helped squish a bug!