Re: Setting up ACL for Squid as a Web Accelerator

On Nov 14, 2007 10:42 AM, Amos Jeffries <squid3@xxxxxxxxxxxxx> wrote:
> Ed Singleton wrote:
> > I'm trying to set up squid as a web accelerator behind apache for a
> > couple of slow dynamic sites I have.
>
> Well, first trouble is that accelerators should be in _front_ of the web
> server. Apache has perfectly fine caching internally for cacheable
> content. All the benefit from acceleration comes from taking the load
> off before it gets near Apache.

Unfortunately my hosting company (Rackspace) uses RedHat which is
still using Apache 2.0, which doesn't have great support for caching.
Apache 2.2 is coming in an update in a few weeks, but I'm getting the
traffic this week.

I think in the long term I'll probably switch to a different provider
and use the Apache caching, because Rackspace haven't really been that
helpful over this.
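
(For what it's worth, once 2.2 does arrive I think the built-in caching
would just be something like the below; untested, and the module paths
and cache directory are only guesses for a RHEL-style layout.)

   # rough mod_cache setup for Apache 2.2 (untested sketch; module
   # paths and cache dir are assumptions, not my real config)
   LoadModule cache_module modules/mod_cache.so
   LoadModule disk_cache_module modules/mod_disk_cache.so

   CacheEnable disk /
   CacheRoot /var/cache/apache2
   CacheDirLevels 2
   CacheDirLength 1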

> >  I have lots of virtual hosts in
> > Apache and I want to be able to proxy a couple of them through squid.
> >
> > In Apache I have this for one of my virtual hosts:
> >
> > RewriteRule ^/(.*)$
> > http://127.0.0.1:3128/http://212.100.249.204:7171/$1 [L,P]
> >
> > However, when I try to access the address I get this error:
> >
> > "Access Denied.  Access control configuration prevents your request
> > from being allowed at this time."
> >
> > I'm using webmin, and I can't figure out what rules I need to set up in
> > order to allow the request.  I've even tried having an Allow All rule,
> > but that didn't make any difference.
> >
> > If anyone can give me some pointers I'd be extremely grateful.
>
> Slow:
>    client -> Apache
>
> Attempt 1:
>    client -> Apache -> Squid -> Apache
>
> now guess how much work Apache is now doing?
>
> better to try this:
>
> client -> Squid2.6 -> Apache

Thanks for the pointers on this, but Apache is running nice and fast
and doesn't need any caching, and I certainly don't want to cache all
my websites.

It's more like:

lots of:

client --fast--> Apache --fast--> website

and one:

client --fast--> Apache ---slow---> 'dynamic' web app

All the other sites that Apache is serving up are fine, there's just
the one that is slow.  It seems like it would be quicker and easier to
write a spider that crawls the site and saves the static files
somewhere that Apache can serve them.
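
Something along these lines is roughly what I had in mind; untested,
and the URL and output directory are just placeholders rather than my
real setup:

   # crude spider: mirror the slow site into a directory that Apache
   # can serve as plain static files (placeholder URL and path)
   wget --mirror --no-parent --convert-links \
        -P /var/www/static-copy http://www.example.com/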

> apache config - same as before, just serve on a non-80 port (single
> machine setup) or point DNS at the squid server (multi machine setup)
>
>
> (use a recent 2.6 for accelerating)
> squid: (assuming apache is on 1.2.3.4 port 81)
>
>    http_port 80 vhost defaultsite=www.example.com
>    # www.example.com for broken clients who try to
>    # GET without saying which domain.
>
>    cache_peer 1.2.3.4 81 0 no-query no-digest no-netdb-exchange
> originserver name=web
>
>    # if you can list the domains accelerated easily
>    # you may also want an ACL pre-filtering the domains
>    acl accelSites dstdomain www.example.com
>    cache_peer_access web allow accelSites
>    http_access allow accelSites
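
Just to check I follow, I think the complete minimal squid.conf for my
case would be roughly the below; untested, with www.slow-example.com
standing in for the real vhost and the backend kept on port 7171 as in
my rewrite rule:

   # untested sketch: Squid 2.6 accelerating only the one slow vhost
   # (www.slow-example.com is a placeholder hostname)
   http_port 80 vhost defaultsite=www.slow-example.com

   cache_peer 212.100.249.204 7171 0 no-query no-digest no-netdb-exchange originserver name=slowweb

   acl accelSites dstdomain www.slow-example.com
   cache_peer_access slowweb allow accelSites
   http_access allow accelSites
   # 'all' is the stock src acl from the default squid.conf
   http_access deny all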

Anyway, thank you for taking the time to help.  It's appreciated.

Ed
