Usually that stuff is logged to a plain text file, and then Squid itself (or a separate program) parses the logs. I know the SCO Unix server has this capability. Be warned, though: depending on traffic, the log file can grow rather large.

In college we wrote a C program that parsed the logs, pulled out just the domain names, listed the top 20, and gave you quite a bit of data. (I've pasted a rough sketch of that sort of parser at the very bottom of this message, below the quoted mail.) As for something off the shelf or bundled with Linux, you've got me. :-)

Hope this helps.

Robert Williams
Programmer / Web Developer / Network Administrator
Covenant Data Systems, Inc.
http://www.covenantdata.com - Where data becomes information!
rwilliams@xxxxxxxxxxxxxxxx

-----Original Message-----
From: redhat-list-bounces@xxxxxxxxxx [mailto:redhat-list-bounces@xxxxxxxxxx] On Behalf Of Scott Sharkey
Sent: Tuesday, June 28, 2005 12:40 PM
To: redhat-list@xxxxxxxxxx
Subject: "Proxy" server needed - Web Access Monitor?

Hi All,

I have a need for a service or software that will monitor and track outbound HTTP (web) accesses and report what pages are being visited. I know that Squid can theoretically do this, but it seems like overkill for this application. The user couldn't care less about caching; they just want to monitor which URLs their staff is visiting. Is Squid the best option, or is there something else that would do a better job? Part of my concern is that the machine they have to run this on does not have a huge amount of memory, and I understand Squid can be a memory hog.

Thanks!!!

-Scott

--
redhat-list mailing list
unsubscribe mailto:redhat-list-request@xxxxxxxxxx?subject=unsubscribe
https://www.redhat.com/mailman/listinfo/redhat-list
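
Here is the sketch I mentioned above, reconstructed from memory rather than from the original program: it reads a Squid native-format access.log on stdin (the URL is assumed to be the seventh whitespace-separated field; adjust the sscanf if your log format differs), counts hits per domain, and prints the top 20.

/* topdomains.c - read a Squid native access.log on stdin, count hits
 * per domain, and print the top 20.  The URL is assumed to be the 7th
 * whitespace-separated field, as in Squid's default log format.
 * Build:  gcc -o topdomains topdomains.c
 * Run:    ./topdomains < /var/log/squid/access.log
 */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define MAXDOM 4096   /* max distinct domains tracked */
#define DOMLEN 256    /* max length of a domain name  */

struct entry { char dom[DOMLEN]; long hits; };
static struct entry tab[MAXDOM];
static int ntab;

/* find the domain in the table (linear scan) or add it */
static void count(const char *dom)
{
    int i;
    for (i = 0; i < ntab; i++) {
        if (strcmp(tab[i].dom, dom) == 0) { tab[i].hits++; return; }
    }
    if (ntab < MAXDOM) {
        strncpy(tab[ntab].dom, dom, DOMLEN - 1);
        tab[ntab].hits = 1;
        ntab++;
    }
}

/* qsort comparator: most hits first */
static int byhits(const void *a, const void *b)
{
    long x = ((const struct entry *)a)->hits;
    long y = ((const struct entry *)b)->hits;
    return (y > x) - (y < x);
}

int main(void)
{
    char line[8192], url[4096], dom[DOMLEN];
    char *p;
    size_t n;
    int i;

    while (fgets(line, sizeof line, stdin)) {
        /* skip six fields, grab the seventh (the URL) */
        if (sscanf(line, "%*s %*s %*s %*s %*s %*s %4095s", url) != 1)
            continue;
        /* strip the "http://" scheme if present, then cut the host
         * off at the first '/' or ':' (the ':' handles CONNECT
         * entries, which log only host:port) */
        p = strstr(url, "://");
        p = p ? p + 3 : url;
        n = strcspn(p, "/:");
        if (n == 0 || n >= DOMLEN)
            continue;
        memcpy(dom, p, n);
        dom[n] = '\0';
        count(dom);
    }

    qsort(tab, ntab, sizeof tab[0], byhits);
    for (i = 0; i < ntab && i < 20; i++)
        printf("%8ld  %s\n", tab[i].hits, tab[i].dom);
    return 0;
}

The linear scan is slow in theory but plenty fast for a daily report on one office's traffic; swap in a hash table if the list of distinct domains grows large.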