On 19/03/2013 12:57 a.m., James Harper wrote:
> Say I have a squid reverse proxy with https enabled on it at https://apps.example.com. This serves a number of apps including:
>   /owa - outlook web access
>   /rpc - ms terminal server gateway
>   /intranet
>   /bugtracker
>   /svn - svn anon browser access
>   /procedures
> These are spread across a bunch of completely different servers (some linux, some windows) and it works really really well. It has been decided that some of the individual applications are not secure enough: /owa, /rpc, and /bugtracker are fine, while /intranet, /procedures, and /svn are not. I have set up acls to deny external access to the insecure apps, but now I want to put some front-end security on them such that when a user first tries to access one with a browser they are redirected and required to sign in via a web-forms-based page. The idea I have for this is:
> . create an sqlite database in /var/run or some other throwaway location
NP: sqlite is known to be terribly slow for this type of thing. You may want to reconsider the exact DB type there.
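(For reference, a minimal squid.conf sketch of the layout described above. Only the URL paths come from the post; the acl names and the "internal network" definition are assumptions and would need adjusting to the real networks.)

  # apps judged not secure enough for direct external access
  acl insecure_apps urlpath_regex ^/(intranet|procedures|svn)(/|$)

  # hypothetical definition of internal clients
  acl internal_net src 192.168.0.0/16

  # current state: the insecure apps are reachable from inside only
  http_access allow insecure_apps internal_net
  http_access deny insecure_apps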
> . redirect users using deny_info to the sign-in page (php)
> . on successful authentication, set a cookie (some random string, eg an md5 hash of username, password, and time) and create a corresponding entry in the database, then redirect the user to the original page (only possible with squid 3.2.x I believe...)
No. Possible with older Squid as well. Pass the original URL to the splash page as a query-string parameter using %s.
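(A hedged sketch of that redirect, assuming the sign-in page lives at /login.php, the original URL is passed in a query parameter called url, and cookie_ok is an external acl wired to the cookie-lookup helper sketched a little further down - all of those names are placeholders.)

  # unauthenticated requests for the insecure apps get bounced to the
  # sign-in page, carrying the originally requested URL in %s
  deny_info https://apps.example.com/login.php?url=%s cookie_ok
  http_access deny insecure_apps !cookie_ok

deny_info is selected by the last acl checked on the denying http_access line, which is why cookie_ok comes last.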
> . create an external acl helper that is passed the request header corresponding to the cookie, decodes the cookie value from the header, and looks up the entry in the database (and maybe timestamps last access). If present, report OK
> . create a cron job nightly (or hourly or whatever) to delete stale records from the database to keep the size reasonable
Why not delete stale entries immediately, as the helper locates them in the DB? That speeds up all later fetches which would otherwise have found them and had to re-test. The number of DB entries is then also never more than your current user load at any point - as opposed to the total unique load across the entire day so far.
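(A sketch of how such a helper might be wired up, assuming a hypothetical /usr/local/bin/check_session_cookie script that reads the Cookie header, looks the token up in the database - expiring stale rows as it finds them, per the note above - and answers OK or ERR. The exact header macro spelling varies a little between Squid versions; %>{Cookie} is the 3.x form.)

  # hand the request's Cookie header to the helper and cache its verdicts
  # briefly so the database is not hit on every single request
  external_acl_type cookie_check ttl=60 negative_ttl=10 children=5 %>{Cookie} /usr/local/bin/check_session_cookie
  acl cookie_ok external cookie_check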
> The cookie here only serves as a lookup into the database, and I believe will be supplied by the browser on any user request.
Squid has a bundled session helper which supports both passive and active (login) sessions. I suggest taking a good look through its documentation and considering whether you can use it instead. Doing so will keep all the session criteria private to the server instead of using Cookie to send out details an attacker can capture and break in with.
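(For comparison, a rough adaptation of the active-login example from the wiki page linked below, using the bundled ext_session_acl helper in active mode (-a) and keying the session on the client IP (%SRC). Every path, name and timeout here is a placeholder, and how the LOGIN is actually triggered depends on how the sign-in page is wired up.)

  # bundled session helper, active (login) mode, sessions idle out after 2h
  external_acl_type session ttl=60 negative_ttl=0 children=1 %SRC /usr/lib/squid3/ext_session_acl -a -t 7200 -b /var/lib/squid3/session.db
  acl session_is_active external session
  acl session_login external session LOGIN

  # a request to this hypothetical URL, made once the sign-in page has
  # verified the credentials, is what starts the session
  acl login_trigger urlpath_regex ^/login-ok
  http_access allow login_trigger session_login

  # the insecure apps then require an active session
  http_access deny insecure_apps !session_is_active
  deny_info https://apps.example.com/login.php?url=%s session_is_active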
http://wiki.squid-cache.org/ConfigExamples/Portal/Splash

Amos