
Re: .com extension blocking causing blocking of redirecting URLs

Marcus,

Thanks for the info... I just wanted to confirm whether it's possible or not.

Once again, thanks a lot.
--
Always try to find truth!!!

------------***---------------***--------------***------------
It's always nice to know that people with no understanding of technology want to evaluate technical professionals based on their own lack of knowledge.
------------***---------------***--------------***------------


--- On Sat, 3/28/09, Marcus Kool <marcus.kool@xxxxxxxxxxxxxxx> wrote:

> From: Marcus Kool <marcus.kool@xxxxxxxxxxxxxxx>
> Subject: Re: .com extension blocking causing blocking of redirecting URLs
> To: "Truth Seeker" <truth_seeker_3535@xxxxxxxxx>
> Cc: "Squid maillist" <squid-users@xxxxxxxxxxxxxxx>
> Date: Saturday, March 28, 2009, 6:42 PM
> 
> 
> Truth Seeker wrote:
> > Dear Marcus,
> > 
> > Thanks for your reply... but it's not working for me. The thing is, my ACL will not block www.example.com; it will only block www.example.com/something.com, because I am using urlpath_regex instead of url_regex in the ACL declaration.
> > 
> > Then I tried your regex as well, but it doesn't solve my problem.
> > 
> > My situation is: sensex.com works, but when this site redirects to http://landing.domainsponsor.com/oplatum.plmna/sites=www.someother.com it fails, because the URL path ends with .com.
> > 
> > Is there any workaround for this?
> > 
> > Hope my question is clearer now...
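
For readers following the thread: Squid's url_regex matches the complete URL, while urlpath_regex matches only the path part after the host, which is why the ACL in question spares www.example.com itself. A rough illustration with a hypothetical request:

    Request: http://www.example.com/something.com

    url_regex matches against:      http://www.example.com/something.com
    urlpath_regex matches against:  /something.com

    .*\.com$  matches  /something.com  -> blocked
    a request for http://www.example.com has path / -> no match, passes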
> 
> Yep, the problem is clearer now... and I am afraid that there is no solution that works well using regular expressions. Regular expression matching for downloadable files does not work well because all kinds of URL patterns produce false positives (the example that you gave) and false negatives (e.g. www.example.com/virus.com&version=1).
> 
> Note that the filename 'virus.com' may be renamed to 'www.innocentname.com' and regular expression matching will never block it.
> 
> I suggest keeping antivirus software up to date and, if you work in a controlled environment, taking appropriate measures to limit users' ability to install applications on their PCs.
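
Marcus's point is easy to reproduce with grep -E, which accepts the same POSIX extended regexes that Squid compiles (the -i mirrors the -i flag on the urlpath_regex line; a quick sketch, not from the thread):

    # False positive: a legitimate redirect path happens to end in .com
    $ echo '/oplatum.plmna/sites=www.someother.com' | grep -Ei '.*\.com$'
    /oplatum.plmna/sites=www.someother.com

    # False negative: the dangerous filename no longer ends the path, so nothing matches
    $ echo '/virus.com&version=1' | grep -Ei '.*\.com$'
    (no output; grep exits with status 1)
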
> > --- On Sat, 3/28/09, Marcus Kool <marcus.kool@xxxxxxxxxxxxxxx> wrote:
> > 
> >> From: Marcus Kool <marcus.kool@xxxxxxxxxxxxxxx>
> >> Subject: Re: .com extension blocking causing blocking of redirecting URLs
> >> To: "Truth Seeker" <truth_seeker_3535@xxxxxxxxx>
> >> Cc: "Squid maillist" <squid-users@xxxxxxxxxxxxxxx>
> >> Date: Saturday, March 28, 2009, 1:53 PM
> >> The ACL blocks URLs that end with .com, i.e. it blocks a URL such as www.example.com while it does not block www.example.com/index.html.
> >>
> >> If you change the patterns to include a slash you are fine. The slash prevents domains ending in .com from being matched, e.g.
> >>
> >>    .*\.com$   becomes   .*\..*/.*\.com$
> >>
> >> Marcus
> >>
> >>
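
To check the suggested rewrite, the pattern can be fed through grep -E, which uses the same POSIX extended regex syntax that Squid compiles (a sketch with the URLs from this thread collected in a hypothetical test-urls.txt):

    $ cat test-urls.txt
    http://www.example.com
    http://www.example.com/virus.com
    http://landing.domainsponsor.com/oplatum.plmna/sites=www.someother.com

    $ grep -E '.*\..*/.*\.com$' test-urls.txt
    http://www.example.com/virus.com
    http://landing.domainsponsor.com/oplatum.plmna/sites=www.someother.com

The bare domain no longer matches, but the redirecting URL still ends in .com after a slash and is still caught, which is exactly the failure reported further up the thread.
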
> >> Truth Seeker wrote:
> >>> Hi Techies,
> >>>
> >>> I have an ACL which blocks downloads of files with harmful extensions, like .exe, .bat, .com, etc. This rule is working fine. The details follow:
> >>>
> >>> ### Blocking of dangerous extensions for certain groups
> >>> acl dangerous_extension urlpath_regex -i "/etc/squid/include-files/dangerous_extension.squid"
> >>> http_access allow vip_acl dangerous_extension
> >>> http_access allow power_acl dangerous_extension
> >>> http_access allow ultimate_acl dangerous_extension
> >>> http_access allow download_surfers_acl dangerous_extension
> >>> http_access deny dangerous_extension
> >>> deny_info ERR_DANGEROUS_EXTENSIONS dangerous_extension
> >>>
> >>> # cat /etc/squid/include-files/dangerous_extension.squid
> >>> .*\.exe$
> >>> .*\.com$
> >>> .*\.vb$
> >>> .*\.vbs$
> >>> .*\.vbe$
> >>> .*\.cmd$
> >>> .*\.bat$
> >>> .*\.ws$
> >>> .*\.wsf$
> >>> .*\.scr$
> >>> .*\.shs$
> >>> .*\.pif$
> >>> .*\.hta$
> >>> .*\.jar$
> >>> .*\.js$
> >>> .*\.jse$
> >>> .*\.lnk$
> >>> .*\.mov$
> >>> .*\.3gp$
> >>> .*\.avi$
> >>> .*\.rar$
> >>> .*\.zip$
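
A note on evaluation order, since it matters for the rules above: Squid processes http_access lines top to bottom and stops at the first line whose ACLs all match, so the allow lines exempt those groups before the final deny is reached. A minimal trace, assuming a hypothetical user who matches power_acl and requests a path ending in .exe:

    http_access allow vip_acl dangerous_extension      <- vip_acl: no match, continue
    http_access allow power_acl dangerous_extension    <- both match: ALLOWED, stop
    http_access deny dangerous_extension               <- never reached for this user
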
> >>>
> >>> If there is a site which redirects traffic to another .com site, it will trigger the above rule, resulting in the failure of a legitimate request. How can I work around this issue?
> >>>
> >>> Thanks in advance...


      

