On Wed, Oct 19, 2011 at 2:10 AM, Andrew Beverley wrote:
>
>> acl A dstdomain 192.168.235.136
>> acl B urlpath_regex /splash.html /check.html
>> http_access allow A B
>
> The above 2 rules do not appear to be used?

Well, the idea was to use these to make sure I can GET those two links
used for the check and splash pages without getting stuck in a loop.

>> acl clicked_login_url url_regex -i http://192.168.235.136/check.html
>> http_access allow clicked_login_url session_LOGIN
>
> This all looks correct to me. However, I would run a test yourself from
> a shell. Just run the session helper yourself from a command prompt and
> enter the IP address of your computer to test it:
>
> /usr/local/squid3.2/libexec/ext_session_acl -T 30 -b /usr/local/squid3.2/lib/session.db -a
>
> Then type:
>
> 10 192.168.0.1 [change IP address as appropriate]
>
> You should either get OK or ERR in response

10 192.168.235.136
10 ERR message="No session available"

It does seem to work, but only randomly. It goes like this:

* squid is started; I am not allowed any access, even though I hit the check link
* I restart squid; I am allowed access, and a few seconds later I am denied
* I restart squid; again no access
* I restart squid once more; access is allowed, and a few seconds later denied again

The odd thing is that I can never get squid to reproduce the errors
consistently. I just toyed around with the parameters in squid.conf and
then reverted to the old ones, and now it's stuck in an infinite loop
trying to GET splash.php.

> I suspect that the actual problem is a sync problem when running
> multiple session helpers (they cache the database individually). This
> problem is fixed with an upgrade to a newer Berkeley DB version in
> version 1.2 of the session helper, currently waiting acceptance into
> trunk. In the meantime the patch is available here:
>
> http://www3.us.squid-cache.org/mail-archive/squid-dev/201110/0116.html
>
> Andy

I'm having trouble applying the patch. Can't I just rebuild from a newer
source tree instead?
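
For reference, this is roughly how the session pieces are wired together in
my squid.conf. I'm reconstructing it from memory, so treat the exact paths,
TTL values, and ACL names as a sketch rather than my exact config; the
children=1 setting is there because of the per-helper database caching
problem you mentioned:

    # Active-mode session helper; a single child avoids the
    # multiple-helper DB sync problem (values are illustrative)
    external_acl_type session ttl=60 negative_ttl=0 children=1 %SRC /usr/local/squid3.2/libexec/ext_session_acl -a -T 30 -b /usr/local/squid3.2/lib/session.db

    # Hitting check.html with the LOGIN argument starts the session
    acl session_LOGIN external session LOGIN
    acl clicked_login_url url_regex -i http://192.168.235.136/check.html
    http_access allow clicked_login_url session_LOGIN

    # Everyone else gets bounced to the splash page
    acl existing_session external session
    http_access deny !existing_session
    deny_info http://192.168.235.136/splash.html existing_session

Does that match what you'd expect for this setup?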