Mlib wrote:
Squid noob here.
Setting up a language lab. I need to ALLOW only 6 web sites.
2 of the 6 are complete domains - that part I have working.
4 of the 6 are URLs (i.e. only part of a domain) - that part I need help
with.
Here's my squid.conf (once again - it's working for the full domains)
visible_hostname mlibrary
acl all src 0.0.0.0/0.0.0.0
acl localnet src 192.168.100.0/24
acl whitelist dstdomain "/etc/squid/whitelist"
http_access allow localnet whitelist
http_port 8080
Here's the whitelist
.elllo.org
.montroselibrary.org
I need to allow the following 4 urls (and just those pages, nothing more in
those domains).
http://rlnvault.com/rln09/shows/spanish/coffee-break-spanish/
http://www.bbc.co.uk/languages/
http://www.eslpod.com/website/index_new.html
http://www.mangolanguages.com/lesson
Reading (*gasp*) the wiki at squid-cache.org, it looks like url_regex will
work; I just don't understand how to add it to my squid.conf like I did with
dstdomain.
acl something url_regex ^http://rlnvault\.com/rln09/shows/spanish/coffee-break-spanish/

That's one line. Repeat as necessary for each URL.
Then it's up to you how you structure the http_access lists to make it do
what you want. The wiki section SquidFaq/SquidAcls on common mistakes
covers how http_access works.
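
Roughly like this, as a sketch only ('lab_pages' is just an example ACL
name; it assumes your existing localnet and whitelist ACLs stay as they
are, and that nothing else should get through):

  # one url_regex line per page; lines with the same ACL name are OR'd together
  acl lab_pages url_regex ^http://rlnvault\.com/rln09/shows/spanish/coffee-break-spanish/
  acl lab_pages url_regex ^http://www\.bbc\.co\.uk/languages/
  acl lab_pages url_regex ^http://www\.eslpod\.com/website/index_new\.html
  acl lab_pages url_regex ^http://www\.mangolanguages\.com/lesson

  # allow the two full domains, then the four pages, then deny everything else
  http_access allow localnet whitelist
  http_access allow localnet lab_pages
  http_access deny all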
Some notes:
 * each dot '.' has been escaped as '\.' for the regex, and a '^' anchor has
been added at the start of the http:// to prevent bypass hacks being used.
 * URLs ending in '/' often do not retrieve the actual page, only a
redirect that sends the browser to whatever index/home page actually
exists in that directory.
 * you may end the pattern with a '$' to prevent sub-objects inside those
directory '/' URLs from being retrieved (see the example after these
notes). But I think you want them to display.
 * things like images and display CSS are often served from different URLs.
You may find that the above patterns show only the page text in a strange
layout and nothing else.
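
For example, using the first URL only as an illustration:

  # without '$': matches the directory page and anything under that path
  acl lab_pages url_regex ^http://rlnvault\.com/rln09/shows/spanish/coffee-break-spanish/
  # with '$': matches only that exact URL, nothing below it
  acl lab_pages url_regex ^http://rlnvault\.com/rln09/shows/spanish/coffee-break-spanish/$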
Amos
--
Please be using
Current Stable Squid 2.7.STABLE7 or 3.0.STABLE19
Current Beta Squid 3.1.0.14