
Re: Question regarding cache.log in newer versions of squid, where are my logs??

On Fri, 24 May 2013 22:52:27 +1200
Amos Jeffries <squid3@xxxxxxxxxxxxx> wrote:

> On 24/05/2013 9:32 p.m., Squidblacklist wrote:
> > Yeah, my log files are in /var/log/squid3/cache.log, but it doesn't
> > matter where my log files are; it's what squid is, or actually, isn't,
> > doing that has me unhappy.
> >
> >   Let's say that normally, when I build a new list, I would have the
> >   following in my blacklists:
> >
> > .somesite.com
> > porn.somsite.com
> > othersite.com
> > .othersite.com
> >
> >
> > OK, squid would throw enough errors that I could sort the log file
> > and have a nice list of each and EVERY error in my list.
> >
> > Now all squid does is poop out a single error, and that's it; that's
> > all it logs, and it ignores the rest of the erroneous entries
> > completely. The previous versions of squid that shipped with Debian,
> > 3.1.6 and lower, gave a log entry for each and every error in an
> > external ACL file; this new version, as I said, DOES NOT.
> >
> > It does a single entry and that's it. I'm just curious as to why.
> >
> > Here's an example.
> >
> >
> > root@galileo:/etc/squid3# squid3 -k reconfigure
> > 2013/05/23 22:30:17| ERROR: '.web-cam-sex.webgidsje.nl' is a subdomain of '.webgidsje.nl'
> > 2013/05/23 22:30:17| ERROR: because of this '.webgidsje.nl' is ignored to keep splay tree searching predictable
> > 2013/05/23 22:30:17| ERROR: You should remove '.webgidsje.nl' from the ACL named 'test'
> > FATAL: Bungled squid.conf line 47: acl test dstdomain "/etc/squid3/squid-porn.acl"
> > Squid Cache (Version 3.1.20): Terminated abnormally.
> > CPU Usage: 43.199 seconds = 42.711 user + 0.488 sys
> > Maximum Resident Size: 155056 KB
> > Page faults with physical i/o: 0
> >
> > root@galileo:/etc/squid3# sed  -i '/.webgidsje.nl/d' squid-porn.acl
> > root@galileo:/etc/squid3# squid3 -k reconfigure
> > 2013/05/23 22:36:20| ERROR: '.tydeue.www2.prexon.nl' is a subdomain of '.www2.prexon.nl'
> > 2013/05/23 22:36:20| ERROR: because of this '.www2.prexon.nl' is ignored to keep splay tree searching predictable
> > 2013/05/23 22:36:20| ERROR: You should remove '.www2.prexon.nl' from the ACL named 'test'
> > FATAL: Bungled squid.conf line 47: acl test dstdomain "/etc/squid3/squid-porn.acl"
> > Squid Cache (Version 3.1.20): Terminated abnormally.
> > CPU Usage: 43.515 seconds = 43.019 user + 0.496 sys
> > Maximum Resident Size: 156896 KB
> > Page faults with physical i/o: 0
> >
> > root@galileo:/etc/squid3# sed  -i '/.prexon.nl/d' squid-porn.acl
> > root@galileo:/etc/squid3# squid3 -k reconfigure
> > 2013/05/23 22:39:33| ERROR: '.danx.wwwpuntocom.com' is a subdomain of '.wwwpuntocom.com'
> > 2013/05/23 22:39:33| ERROR: because of this '.wwwpuntocom.com' is ignored to keep splay tree searching predictable
> > 2013/05/23 22:39:33| ERROR: You should remove '.wwwpuntocom.com' from the ACL named 'test'
> > FATAL: Bungled squid.conf line 47: acl test dstdomain "/etc/squid3/squid-porn.acl"
> > Squid Cache (Version 3.1.20): Terminated abnormally.
> > CPU Usage: 43.975 seconds = 43.455 user + 0.520 sys
> >
> >
> > And so forth. The older version, 3.1.6, would not do this; it would
> > log ALL the errors in an external ACL. This new version does not; it
> > logs ONE error and then gives up.
> >
> > Is there a way to make it more verbose? Or to make it proceed
> > without stopping on the first error?
> 
> Squid is not so much halting on the first problem as halting on
> non-recoverable errors. You are just lucky enough not to have any
> recoverable WARNINGS showing up (they might under -k parse).
> 
> The difference between 3.1.6 and 3.1.20 was that we identified that
> the above two ERROR cases resulted in security holes remaining in the
> loaded config. For example, ignoring the shorter of those two, as done
> by 3.1.6, would result in a large number of "listed" entries being
> wrongly dropped from the list - and live traffic being accepted which
> should have been dropped. The correct fix is to drop the longer of
> the two, which that version of Squid cannot do very easily, so we
> made it halt and require the admin to make changes.
> 

> FWIW: I've gone back over all this logic in 3.3 and updated it to be
> a lot smarter, ignoring duplicates again, etc, etc.
>   3.3 will still halt if the case is one where the must-remove entry
> has been loaded first, but the resolution is more often automated now.
> 
> 
> >
> > I mean, it's not critical that I have the candy I want that was taken
> > from me; I can always use an older version from the Debian Squeeze
> > repos that does what I want it to do. I just wanted to know: what gives?
> 
> Or you can use the squid3 package from the Sid repository, which will
> give you better advice than 3.1 was able to.
> 
> Amos
> 


Yeah, sorry if that was shrewd. The reason I enjoy the long list of
errors is that it's easy for me to sed/awk/grep that cache.log into a
single list of URLs and then pop one command to remove them all from
the lists, so I was basically using that as part of my process for
testing and eliminating errors in the blacklists (roughly the sketch
below). I'll give that -k parse a spin and see how that works; if not,
I'll just use 3.1.6 on my testing box for now.

Thank you for tolerating me! :P
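
P.S. Following Amos' point that the right fix is to drop the longer of
the two entries, here is a rough pre-check I might run over a list
before handing it to squid. It is only a sketch: it assumes one domain
per line, does not handle comments or blank lines, and only catches the
exact case from the log, a longer entry sitting under a dot-prefixed
shorter one.

  # print every entry that is a subdomain of a broader dot-prefixed entry
  awk 'NR==FNR { dom[$0] = 1; next }
       {
         for (d in dom) {
           if (d != $0 && d ~ /^\./ && length($0) > length(d) &&
               substr($0, length($0) - length(d) + 1) == d)
             print $0 " is covered by " d
         }
       }' squid-porn.acl squid-porn.acl

Anything it prints is a candidate to delete while keeping the broader
entry; it is not a replacement for -k parse.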


-
Signed,

Fix Nichols

http://www.squidblacklist.org



