
Re: Multiple different parents and no default


 



Markus Meyer wrote:
Chris Robertson wrote:

Hi Chris,

What do you *want* to do?

Yeah, right. I'm confused, so I'll try again. My Squid is "myproxy" and
is an accelerator proxy for two webservers, "jallah.image" and
"kalimba.image". Both webservers have completely different content, and I
need to find a way to distinguish between the client requests.

I can tell the clients what to request. Content from "jallah.image"
comes via http://myproxy/jallah_img/whatever.jpg and content from
"kalimba.image" comes via http://myproxy/kalimba_img/blubba.jpg, so I
could catch those with ACLs and url_regex.
But when my proxy forwards the requests as they are it won't work, so I
have to rewrite the URL from http://myproxy/kalimba_img/blubba.jpg to
http://myproxy/blubba.jpg.

I tried this with the following configuration:
cache_peer jallah.image parent 80 0 no-query no-digest originserver
cache_peer kalimba.image parent 80 0 no-query no-digest originserver
# jallah ACL
acl jallah urlpath_regex ^/jallah_img
cache_peer_access jallah.image allow jallah
cache_peer_access kalimba.image deny jallah
# kalimba ACL
acl kalimba urlpath_regex ^/kalimba_img
cache_peer_access kalimba.image allow kalimba
cache_peer_access jallah.image deny kalimba
url_rewrite_program /rewrite.pl

Here is the code of rewrite.pl:
#!/usr/bin/perl
# Squid hands the helper one request per line; the URL is the first field.
$|=1;   # unbuffered output, required for rewrite helpers
while (<>) {
        # strip the jallah routing prefix from the URL path
        s@/jallah_img@@;
        # strip the kalimba routing prefix from the URL path
        s@/kalimba_img@@;
        print;
}

But this doesn't work, because I think the url_rewrite_program runs
before the ACLs are evaluated. With the above setup I tested
requests for both webservers, http://myproxy/kalimba_img/blubba.jpg and
http://myproxy/jallah_img/whatever.jpg, and "myproxy" always asked the
first listed peer, "jallah.image".
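If the rewritten URL is indeed what the peer-selection ACLs see, the stripped path prefix can never match either urlpath_regex, which would explain why the first peer always wins. A hedged workaround sketch (not from the original mail): have rewrite.pl return the absolute origin URL instead of just stripping the prefix, and split on dstdomain rather than on the path:

```
# sketch, assuming rewrite.pl now returns e.g.
# http://jallah.image/whatever.jpg or http://kalimba.image/blubba.jpg
acl to_jallah dstdomain jallah.image
acl to_kalimba dstdomain kalimba.image
cache_peer_access jallah.image allow to_jallah
cache_peer_access jallah.image deny all
cache_peer_access kalimba.image allow to_kalimba
cache_peer_access kalimba.image deny all
```

The explicit "deny all" lines avoid relying on Squid's implicit default (the opposite of the last listed rule) when neither ACL matches.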

Go direct?

acl jallah url_regex ^jallah_img\/
acl kalimba url_regex ^kalimba_img\/
always_direct allow !jallah !kalimba

I'm not sure since I don't really understand what this is doing.

Pick a (pseudo) random parent for requests that DON'T match those terms
in the URL?


This won't work since the content on both webservers is different.

Use one parent proxy for all requests that DON'T match those terms
in the URL?

There is no parent proxy. I'm just trying to build an accelerator proxy
for external static servers over which I have no control.

I hope I could make my issue more or less clear.
More or less...  :o)

Hope this time I did a better job in explaining...

Cheers,
       Markus


Markus, if you are altering the URL anyway you might find this a simpler way to do the whole thing:

create two sub-domains:
 jallah.myproxy
 kalimba.myproxy

configure cache_peers with forcedomain=X option (to change the domain seen by the peer to X)

cache_peer jallah.image parent 80 0 no-query no-digest originserver forcedomain=jallah
cache_peer kalimba.image parent 80 0 no-query no-digest originserver forcedomain=kalimba

(or maybe you want forcedomain=myproxy)

use dstdomain ACL to do the split, same as now.

leave the url-path identical to that found on the source servers.
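Putting those steps together, a minimal sketch of the sub-domain approach (the dstdomain ACL names and "deny all" lines are my additions, not from the mail above):

```
# clients request http://jallah.myproxy/... or http://kalimba.myproxy/...
cache_peer jallah.image parent 80 0 no-query no-digest originserver forcedomain=jallah
cache_peer kalimba.image parent 80 0 no-query no-digest originserver forcedomain=kalimba
# split on the requested sub-domain instead of the URL path
acl to_jallah dstdomain jallah.myproxy
acl to_kalimba dstdomain kalimba.myproxy
cache_peer_access jallah.image allow to_jallah
cache_peer_access jallah.image deny all
cache_peer_access kalimba.image allow to_kalimba
cache_peer_access kalimba.image deny all
```

With this layout no url_rewrite_program is needed at all, since the URL path already matches what the origin servers expect.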


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE6 or 3.0.STABLE14
  Current Beta Squid 3.1.0.7
