To reduce the stress on the DB you can use a custom format, as Amos suggested, but I think that once you define exactly what you want to log, you will get what you need. The general squid access log is fairly loose, and I believe that on today's hardware the difference will only be visible on systems serving thousands or millions of client requests. If this is a small site, it is not required.

All The Bests,
Eliezer

----
Eliezer Croitoru

From: Alex K <rightkicktech@xxxxxxxxx>

+++ Including list +++

Hi Eliezer,

I have used the following lines to instruct squid to log to MariaDB:

Through testing it seems that sometimes squid does not log anything, and I don't know why. After a restart it unblocks and writes to the DB again. The access_log table is currently InnoDB, and I am wondering whether MyISAM would behave better. I would prefer to have a real-time access log.

My scenario is that when a user disconnects from squid, an aggregated report of the sites the user browsed becomes available in a web portal the user has access to. Usually there will be up to 20 users connected concurrently, so I have to check whether this approach is scalable. If it turns out not to be stable, I might fall back to log parsing (perhaps logstash or a custom parser) that parses the log and generates an aggregated report once per hour or day.

Is there a way to format the log and pipe only some interesting fields to the DB, in order to lessen the stress on the DB?

On Sun, May 13, 2018 at 1:25 AM, Eliezer Croitoru <eliezer@xxxxxxxxxxxx> wrote:
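For reference, the trimmed custom format being discussed might look roughly like this in squid.conf. This is a sketch, not the poster's actual lines: the format name, daemon path, and field selection are illustrative, though the directives (logformat, access_log, logfile_daemon) and format codes (%ts, %un, %>a, %rm, %ru, %>Hs) are squid's own:

```
# Hypothetical trimmed format: only the fields a per-user site report needs.
# %ts = seconds since epoch, %un = username, %>a = client IP,
# %rm = request method, %ru = request URL, %>Hs = HTTP status sent to client
logformat dbmin %ts %un %>a %rm %ru %>Hs

# Send only this reduced format through the logging daemon;
# the path below is a placeholder for your DB-daemon parameters.
access_log daemon:/etc/squid/db_params dbmin
logfile_daemon /usr/lib/squid/log_db_daemon
```

Logging fewer fields shrinks each INSERT, which is the point of the suggestion, but as noted above, at ~20 concurrent users the difference is likely negligible.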
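If the log-parsing fallback is chosen instead, a minimal sketch of a custom parser could look like the following. It assumes squid's default native "squid" log format; the sample lines, regex field names, and helper names are illustrative, not from the original thread:

```python
import re
from collections import Counter, defaultdict

# Minimal matcher for squid's default native log format:
# timestamp duration client result/status bytes method URL user ...
LINE_RE = re.compile(
    r'^(?P<ts>\d+\.\d+)\s+(?P<dur>\d+)\s+(?P<client>\S+)\s+'
    r'(?P<result>\S+)\s+(?P<bytes>\d+)\s+(?P<method>\S+)\s+'
    r'(?P<url>\S+)\s+(?P<user>\S+)'
)

def site_of(url):
    # Reduce a URL (or a CONNECT host:port target) to its hostname.
    if '://' in url:
        url = url.split('://', 1)[1]
    return url.split('/', 1)[0].split(':', 1)[0]

def aggregate(lines):
    """Return {user: Counter({site: hits})} for an iterable of log lines."""
    report = defaultdict(Counter)
    for line in lines:
        m = LINE_RE.match(line)
        if not m:
            continue  # skip unparseable lines
        report[m.group('user')][site_of(m.group('url'))] += 1
    return report

# Illustrative sample lines in the default format:
sample = [
    "1526162700.123    145 10.0.0.5 TCP_TUNNEL/200 3421 CONNECT www.example.com:443 alice HIER_DIRECT/93.184.216.34 -",
    "1526162701.456     80 10.0.0.5 TCP_MISS/200 1200 GET http://www.example.com/index.html alice HIER_DIRECT/93.184.216.34 text/html",
]
print(dict(aggregate(sample)))
```

Run hourly or daily over the rotated access.log, this gives the aggregated per-user site counts without putting any write load on the DB at request time.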
_______________________________________________ squid-users mailing list squid-users@xxxxxxxxxxxxxxxxxxxxx http://lists.squid-cache.org/listinfo/squid-users