You can always create a black hole route for those sites on your firewall or routers. At a previous job I had the same request to block certain sites. We did not have a proxy server set up at that time, so I took the quick-and-dirty approach.

On Fri, 2003-07-18 at 23:05, Thom Paine wrote:
> On Fri, 2003-07-18 at 18:03, Florin Andrei wrote:
> > So what is your actual question? :-)
> >
> > If you're looking for a way to track down the users' cybertrail, get
> > MySQL or PostgreSQL to suck in the old (rotated) Squid logs (separate
> > the fields of the logs into cells in the SQL tables when importing
> > them), then perform searches using SQL queries. Well, write a PHB...
> > erm, sorry ;-) a PHP interface over the database.
> >
> > You can identify users by IP address, or by forcing them to log in to
> > the proxy and use their usernames as keys in the search.
> >
> > If you rotate logs daily, then the searches are almost real-time.
> >
> > Not hard to do, fairly fast and very powerful.
>
> Well, that is beyond me. I'm not into PHP programming and stuff like
> that. I hate coding, but I'm starting to like a bit of bash scripting to
> help with trivial stuff. I guess that's how it expands.
>
> I have the sarg program working to track where users go, but I'd like a
> little assistance blocking access to a list of sites. The Executive
> Director does not want people going to Hotmail, Yahoo Mail, and Webshots
> specifically.
>
> So I need help getting Squid tuned with some of the more advanced
> features.
>
> Thanks,
>
> -=/>Thom
--
Scot L. Harris <webid@xxxxxxxxxx>

--
Shrike-list mailing list
Shrike-list@xxxxxxxxxx
https://www.redhat.com/mailman/listinfo/shrike-list
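The "black hole route" approach I mention above looks roughly like this on a Linux box acting as the gateway. This is only a sketch: the address range is a placeholder (big sites change and spread their IPs, so you'd have to look up the current ranges yourself, and keep them updated), which is exactly why it's the quick-and-dirty option compared to doing it in Squid.

```
# Hypothetical example (iproute2); 64.4.32.0/20 is a placeholder range,
# not a verified Hotmail netblock -- resolve the real addresses first:
ip route add blackhole 64.4.32.0/20
# Verify the route took:
ip route show | grep blackhole
```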
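Since Thom asked about blocking in Squid itself: the usual way is a `dstdomain` ACL with an `http_access deny` rule placed before your allow rules. A sketch of the relevant squid.conf fragment (the domain names are my guesses for the three sites named; note that `.yahoo.com` blocks all of Yahoo, not just its webmail, so check your sarg reports for the exact hosts people actually visit):

```
# squid.conf -- block specific destination domains
acl blocked_sites dstdomain .hotmail.com .yahoo.com .webshots.com
http_access deny blocked_sites
# ... your existing http_access allow rules go below the deny ...
```

Squid evaluates `http_access` rules top to bottom, so the deny must appear before the rule that allows your LAN out; then `squid -k reconfigure` to reload.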
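For the record, Florin's logs-into-SQL idea doesn't strictly need MySQL/PostgreSQL plus a PHP front end; here is a minimal sketch of the same technique using SQLite and Python instead (my substitution, not what Florin described). The sample lines imitate Squid's native access.log field order (timestamp, elapsed, client IP, code/status, bytes, method, URL, ...); adjust the parsing if your log format differs.

```python
import sqlite3

# Two fake lines in Squid native access.log format (illustrative only):
SAMPLE_LOG = """\
1058563200.123    110 192.168.1.42 TCP_MISS/200 4512 GET http://www.hotmail.com/ - DIRECT/64.4.33.7 text/html
1058563201.456     80 192.168.1.99 TCP_HIT/200 1024 GET http://www.redhat.com/ - NONE/- text/html
"""

def import_log(conn, lines):
    """Split each log line into fields and load them into a table."""
    conn.execute("""CREATE TABLE IF NOT EXISTS access
                    (ts REAL, elapsed INTEGER, client TEXT, status TEXT,
                     bytes INTEGER, method TEXT, url TEXT)""")
    for line in lines:
        f = line.split()
        if len(f) < 7:
            continue  # skip malformed lines
        conn.execute("INSERT INTO access VALUES (?, ?, ?, ?, ?, ?, ?)",
                     (float(f[0]), int(f[1]), f[2], f[3],
                      int(f[4]), f[5], f[6]))
    conn.commit()

conn = sqlite3.connect(":memory:")
import_log(conn, SAMPLE_LOG.splitlines())

# The "cybertrail" search is then a plain SQL query, e.g. who hit Hotmail:
rows = conn.execute(
    "SELECT client, url FROM access WHERE url LIKE '%hotmail%'").fetchall()
print(rows)
```

With daily log rotation you'd feed each rotated file through `import_log`, and queries stay close to real-time, as Florin says.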