Re: error, logs say TCP_DENIED

Kees Hink wrote:
Amos Jeffries wrote:
Kees Hink wrote:
I'd like to make squid pass requests to pound, but I'm getting an error: "The
requested URL could not be retrieved" (http://pastebin.org/95395). The squid
access log says:
1266857413.088      0 127.0.0.1 TCP_DENIED/400 2212 GET NONE:// - NONE/- text/html

My squid config is the default /etc/squid/squid.conf (Squid 2.7, installed as
an Ubuntu package), with just one extension:
cache_peer 127.0.0.1 parent 28085 0 no-query originserver name=pound_viva
cache_peer_domain pound_viva .localviva.nl
(http://pastebin.org/95397)

The localviva.nl domain is faked in my /etc/hosts. Apache has a VirtualHost
for it, which redirects to squid at port 3128. A pound server is listening on
localhost:28085. I can reach pound directly through http, but squid fails to
relay to it.
This appears to be a backward topology. One of the main points of using
Squid is that it relieves load pressure on heavy, complicated systems
like Apache.

In our case, Apache is not the most complicated (or slowest) system. The
backend server, Plone/Zope, is. But you have a point here.

I configure squid as the front end with most sites going to apache via
cache_peer, and the ones that need non-apache services cache_peer'd to
those services directly.
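
For illustration only (the ports, peer names, and ACL name below are made up,
not from your setup), that pattern looks something like this in squid.conf:

  http_port 80 accel vhost
  # most domains go to the local Apache
  cache_peer 127.0.0.1 parent 8080 0 no-query originserver name=apache
  # domains that need the non-apache service go to it directly
  cache_peer 127.0.0.1 parent 28085 0 no-query originserver name=pound
  acl pound_sites dstdomain .localviva.nl
  cache_peer_access pound allow pound_sites
  cache_peer_access apache deny pound_sites
  cache_peer_access apache allow all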

I read http://wiki.squid-cache.org/SquidFaq/TroubleShooting, but found nothing
on "TCP_DENIED" or "The requested URL could not be retrieved".

I must be missing something really basic here, like a permission setting.
Could someone please help me out?

Apache is mangling the URLs as they go through. Your Squid server does
not know what to do with the garbage:
http://localhost:3128/VirtualHostBase/http/localviva.nl:80/vivalafocus/VirtualHostRoot/

Actually, this is not garbage. The backend server, Zope, has a tool which can
parse this kind of URL to construct meaningful URLs to return. This is a very
common setup for websites that use Zope.
http://www.zope.org/Documentation/Books/ZopeBook/2_6Edition/VirtualHosting.stx

So you think squid has a problem with this URL itself?

What web server do you have listening on localhost port 3128 serving a domain called "localhost"?

That is what and where Squid is being asked to fetch from at present.

Here is a how-to from the zope site for using just Squid:
  http://www.zope.org/Members/htrd/howto/squid

Reading it through, I think the Apache is trying to do what a simple URL rewriter would do for Squid. For now, a Squid ACL to catch the dstdomain "localhost" and port 3128, used for the cache_peer_access redirection, would work in your current setup.
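
A minimal, untested sketch of that, reusing the pound_viva peer name from your
config (the ACL names are just placeholders):

  # match the mangled URLs Apache is handing to Squid
  acl zope_mangled dstdomain localhost
  acl zope_mangled_port port 3128
  cache_peer_access pound_viva allow zope_mangled zope_mangled_port
  cache_peer_access pound_viva deny all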

NP: the FAQ-20 they refer to is the same wiki pages I mentioned below.

NP: the http_accel_* options are replaced in 2.6+ by "http_port ... accel vhost"
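
For this setup that would be roughly (illustrative only, using the port from
your current config):

  http_port 3128 accel vhost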


Long-term I'd still advise looking at removing Apache from the Zope request chain, even if it sticks around for other domains. The Squid rewriter can be made dynamic, working live from any kind of hosting database you like (be it a DB of client-registered domains, or the Zope configuration files themselves, since it's on localhost too ;).
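
The hook for that looks roughly like the lines below, with a hypothetical
helper script (the path is just an example; the helper reads one request per
line on stdin and writes back the rewritten URL, or a blank line for "no
change"):

  # /usr/local/squid/zope_rewrite.py is a made-up path for illustration
  url_rewrite_program /usr/local/squid/zope_rewrite.py
  url_rewrite_children 5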


Please read through these pages and reconsider the way you have Squid
and Apache linked together:
  http://wiki.squid-cache.org/ConfigExamples/Reverse/BasicAccelerator
  http://wiki.squid-cache.org/ConfigExamples/Reverse/VirtualHosting
  http://wiki.squid-cache.org/SquidFaq/ReverseProxy#Running_the_web_server_on_the_same_server


If you have any reasons why you have them linked together in the current
way, bring them up so we can advise on what else you may need to configure.

See above.

Amos
--
Please be using
  Current Stable Squid 2.7.STABLE7 or 3.0.STABLE24
  Current Beta Squid 3.1.0.16
