
Re: url_rewrite_program and Squid 2.6STABLE


 



Tue 2007-01-16 at 22:06 -0500, Travis Derouin wrote:

> Where the 10.xxx servers are our back-end Apache servers (we're using
> Squid for load balancing and caching). It basically just checks that
> all requests are for pages on the www.wikihow.com domain; if not, it
> 301-redirects them to the same requested page on www.wikihow.com.
> We do this because we used to host wikihow on the wiki.ehow.com
> subdomain and have since moved it over, so it's important that we
> 301-redirect old URLs to their new www.wikihow.com domain for SEO
> purposes.

The redirect can be done by denying access and then using deny_info to
send the redirect. It's not an exact match, but quite close. With
deny_info you can redirect to a page that tells the user the sites have
been merged and then, if you like, automatically forwards them to the
new URL.
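
Something along these lines in squid.conf would do it (the acl and page
names here are just examples, adjust to your setup):

  acl wikihow dstdomain www.wikihow.com
  http_access deny !wikihow
  # deny_info matches on the last acl of the denying http_access line
  deny_info http://www.wikihow.com/Site-Moved wikihow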

The forwarding to the correct backend server is done by cache_peer. No
redirector needed there.
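
For example (backend addresses and options here are only illustrative):

  cache_peer 10.0.0.1 parent 80 0 no-query originserver round-robin
  cache_peer 10.0.0.2 parent 80 0 no-query originserver round-robin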

> It seems specifying a deny_info URL will send browsers a 302 redirect;
> it's essential we send them a 301. In addition, it's essential that
> requests for www1.wikihow.com/page2 get 301-redirected to their
> counterpart www.wikihow.com/page2.

Right.. in such a case it's better to continue using a redirector / URL
rewriter helper, but limit it to only the requests which need to be
redirected, not the forwarded requests. See url_rewrite_access.
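
For example (acl name made up), something like:

  acl wikihow_www dstdomain www.wikihow.com
  # only requests for other hostnames go to the rewriter helper
  url_rewrite_access deny wikihow_www
  url_rewrite_access allow all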

> I'm not sure why this version of Squid is running out of rewrite
> children; the only differences between this installation and the other
> one are that we are using epoll and it's on a 64-bit processor.

Neither am I, but it could be timing, with the new version processing the
requests slightly faster or something like that.

> I'm not
> sure if this affects anything. How much memory do the helper
> instances take up?

This depends on the helper. You have to measure it for your helper.
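
A rough way to check on Linux is to look at the RSS column in ps, e.g.
(the helper name here is just an example):

  ps axo pid,rss,vsz,args | grep '[r]ewrite.pl'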


To convert the helper to the new "concurrent" style, just change the
request/response format to include a request tag as the first word. This
tag needs to be echoed back when sending the response to Squid, for example:

$tag = $X[0];   # the tag Squid prepends to each request line
$url = $X[1];   # the requested URL, now the second word

  print "$tag 301:$_\n";   # echo the same tag back in front of the response

Regards
Henrik


