
Looking for web usage reporting solution

I am looking for a web usage reporting solution that can run via sniffing or from a mirror port on a switch. I envision a solution that simply logs each URL request it sees and allows reports to be generated on the web sites internal users have visited. I've searched high and low, but cannot find a "ready-made" solution, so I'm looking to put it together myself.

Most people/posts suggest using Squid/SquidGuard/DansGuardian, but that appears to be an inline-only solution, and I would prefer a sniffing solution for safety (if the machine crashes, it doesn't take down Internet access). In that sense, it would work a lot like Websense, but with reporting only, no blocking.

From a high-level pseudo-code standpoint, it would simply sniff all traffic, and when it sees a packet requesting a web page, parse it and dump these results into a database (a rough sketch in code follows the list):

-Date
-Time
-Source IP
-Dest IP
-URL requested
-FQDN portion of web request - e.g., if the request was for http://www.microsoft.com/windows/server/2003, it records only www.microsoft.com here
-Domain portion of web request - only microsoft.com in the above example
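
Here's a rough sketch of what I have in mind, in Python, using Scapy for the sniffing and SQLite for storage. This is just one possible toolset, and the database file, table name, and columns are all made up for illustration:

import sqlite3
from datetime import datetime
from scapy.all import sniff, IP, TCP, Raw

db = sqlite3.connect("weblog.db")
db.execute("""CREATE TABLE IF NOT EXISTS requests (
    date TEXT, time TEXT, src_ip TEXT, dst_ip TEXT,
    url TEXT, fqdn TEXT, domain TEXT)""")

def log_request(pkt):
    # Only look at packets carrying a TCP payload headed for port 80.
    if not (pkt.haslayer(IP) and pkt.haslayer(TCP) and pkt.haslayer(Raw)):
        return
    if pkt[TCP].dport != 80:
        return
    text = pkt[Raw].load.decode("latin-1")
    lines = text.split("\r\n")
    # An HTTP request starts with e.g. "GET /windows/server/2003 HTTP/1.1".
    if not lines[0].startswith(("GET ", "POST ", "HEAD ")):
        return
    parts = lines[0].split(" ")
    if len(parts) < 2:
        return
    path = parts[1]
    # The Host header carries the FQDN (www.microsoft.com in my example).
    host = next((l.split(":", 1)[1].strip() for l in lines
                 if l.lower().startswith("host:")), "")
    # Naive "domain" = last two labels (microsoft.com); a real tool would
    # need a public-suffix list to handle things like .co.uk correctly.
    domain = ".".join(host.split(".")[-2:]) if host else ""
    now = datetime.now()
    db.execute("INSERT INTO requests VALUES (?,?,?,?,?,?,?)",
               (now.strftime("%Y-%m-%d"), now.strftime("%H:%M:%S"),
                pkt[IP].src, pkt[IP].dst,
                "http://" + host + path, host, domain))
    db.commit()

# Needs root, on an interface that sees the mirrored traffic.
sniff(filter="tcp port 80", prn=log_request, store=0)

This only sees requests split cleanly at packet boundaries; a production version would need TCP stream reassembly, and HTTPS would of course show nothing but the connection itself.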

Using this data, I can then produce reports for the client on who went where, and when. Personally, I thought this would be a great open-source project, but I can't find anything like it already out there! It seems like a mix between Squid, NTOP and Snort...
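
For the reporting side, plain SQL over that (hypothetical) table should cover the basics. For example, hits per source IP per domain, busiest first:

import sqlite3

db = sqlite3.connect("weblog.db")
rows = db.execute("""SELECT src_ip, domain, COUNT(*) AS hits
                     FROM requests
                     GROUP BY src_ip, domain
                     ORDER BY hits DESC""")
for src_ip, domain, hits in rows:
    print(f"{src_ip:15}  {domain:30}  {hits}")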

Thanks for any thoughts on this project!

