
Re: WARNING: You should probably remove 'www.somewebsite.com' from the ACL named 'blacklist'


 



On 13/10/11 22:25, devadmin wrote:
Amos, thanks for the response.

There are no duplicates; I have already sorted out and cleared any dupes
there were, with cat and sed.

Here is a sample of the blacklist format I'm using:

------------------
www.example.com
example.com
------------------

Now I would not consider those "duplicates", because if I don't specify
www.example.com they can still get into the site even if example.com is
being blocked.

Question: am I being silly by doing this? Can I simply use a wildcard
instead?

Yes:

  .example.com
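A leading dot in a dstdomain ACL entry matches the domain itself and every subdomain, so one wildcard line covers both example.com and www.example.com. A minimal squid.conf sketch (the file path is illustrative; "blacklist" is the ACL name from this thread):

```
# Inline form: the leading dot matches example.com and all its subdomains
acl blacklist dstdomain .example.com
# File form for large lists (path is illustrative); one entry per line
# acl blacklist dstdomain "/etc/squid/blacklist.txt"
http_access deny blacklist
```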



Final statement:

I got the errors before I added the second www listings to the file,
so it happens whether or not they are "duplicates".

What they detect depends on your Squid version. Some older Squid versions can complain about the combination the wrong way around, or detect that example.com and www.example.com overlap while ignoring the absence of the wildcard. But the warning is only emitted when one of the two entries is dropped.
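Since each www.example.com / example.com pair can be replaced by a single .example.com wildcard entry, the list can be collapsed mechanically before loading. A rough sketch with awk (filenames are illustrative; it assumes one domain per line and only collapses exact "www." twins):

```shell
# Build a small sample list, then collapse www./bare pairs into wildcards
printf 'www.example.com\nexample.com\nfoo.org\n' > blacklist.txt

awk '
  { seen[$0] = 1 }
  END {
    for (d in seen) {
      if (d ~ /^www\./ && substr(d, 5) in seen)
        out["." substr(d, 5)] = 1    # pair collapses to one wildcard
      else if (!(("www." d) in seen))
        out[d] = 1                   # no www. twin: keep the entry as-is
      # else: the wildcard emitted for the www. twin already covers d
    }
    for (o in out) print o
  }
' blacklist.txt | sort -u > blacklist.dedup.txt

cat blacklist.dedup.txt
```

With the sample input above, the result is the two lines ".example.com" and "foo.org".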


On Thu, 2011-10-13 at 20:55 +1300, Amos Jeffries wrote:
On 13/10/11 19:07, devadmin wrote:
I have a blacklist of about 1 million domains; when I reload Squid I get
about a million of these error messages. Should I do something to
correct them? Squid seems to be functioning just fine otherwise; it is
indeed blocking the sites as I have configured it to do.

The message already answers your question.

To silence the warnings, sort and de-duplicate the list before loading it
into Squid.
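That advice amounts to a one-liner with sort. A minimal sketch (filenames are illustrative):

```shell
# Build a small sample list containing a duplicate entry,
# then sort and de-duplicate it in one step with sort -u
printf 'example.com\nexample.com\nwww.foo.org\n' > blacklist.txt
sort -u blacklist.txt > blacklist.clean.txt
cat blacklist.clean.txt
```

The cleaned file contains each entry exactly once: example.com and www.foo.org.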


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE9 or 3.1.15
  Beta testers wanted for 3.2.0.12

