Re: .com extension blocking causing blocking of redirecting URL's
Thanks for your inputs... let me check that regex, or I will go with OpenDNS as Luis mentioned. I have already started reviewing OpenDNS before stepping into their shoes...

Are there any negatives with these OpenDNS people?


-
--
---
Always try to find truth!!!

------------***---------------***--------------***------------

It's always nice to know that people with no understanding of technology want to evaluate technical professionals based on their own lack of knowledge.

------------***---------------***--------------***------------
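The matching behaviour discussed in the quoted thread below can be checked outside squid. This is an editor's sketch using Python's `re` module (squid compiles POSIX regexes, but these particular patterns behave the same under both engines); the redirect path is the one from the thread, while `/files/setup.com` is a hypothetical example of a real .com download:

```python
import re

# The original urlpath_regex entry from the thread, and Marcus' slash variant.
original = re.compile(r'.*\.com$', re.IGNORECASE)
with_slash = re.compile(r'.*\..*/.*\.com$', re.IGNORECASE)

# urlpath_regex is matched against the path part of the URL only.
redirect_path = '/oplatum.plmna/sites=www.someother.com'  # legitimate redirect (from the thread)
download_path = '/files/setup.com'                        # hypothetical real .com executable

assert original.search(redirect_path)        # false positive: the redirect gets blocked
assert original.search(download_path)        # intended block
assert with_slash.search(redirect_path)      # still blocked: the slash variant does not help here
assert not with_slash.search(download_path)  # and it no longer catches a plain .com download
```

This illustrates why the poster reports that the suggested regex change does not solve his problem: the redirect path already contains a dot and a slash before the trailing .com, so both patterns match it.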


--- On Sat, 3/28/09, Shekharsahab14 <shekharsahab14@xxxxxxxxx> wrote:

> From: Shekharsahab14 <shekharsahab14@xxxxxxxxx>
> Subject: Re: .com extension blocking causing blocking of redirecting URL's
> To: "Luis Daniel Lucio Quiroz" <luis.daniel.lucio@xxxxxxxxx>
> Cc: "squid-users@xxxxxxxxxxxxxxx" <squid-users@xxxxxxxxxxxxxxx>, "Truth Seeker" <truth_seeker_3535@xxxxxxxxx>, "Marcus Kool" <marcus.kool@xxxxxxxxxxxxxxx>
> Date: Saturday, March 28, 2009, 10:57 PM
> Better to take help of open DNS
> 
> Sent from my iPhone
> 
> On 28-Mar-09, at 10:56 PM, Luis Daniel Lucio Quiroz
> <luis.daniel.lucio@xxxxxxxxx> wrote:
> 
> > You should set an expression for that URL. Use a regexp like this:
> > ^http://.......sites=.*\.com
> > I think.
> > On Saturday 28 March 2009 09:14:26 Truth Seeker wrote:
> >> Dear Marcus,
> >>
> >> Thanks for your reply, but it is not working for me. The thing is,
> >> my acl will not block www.example.com; it will only block
> >> www.example.com/something.com, because I am using urlpath_regex
> >> instead of url_regex in the acl declaration.
> >>
> >> Then I tried your regex also, but it does not fix my problem.
> >>
> >> My situation is:
> >>
> >> sensex.com is working, but when this site redirects to
> >> http://landing.domainsponsor.com/oplatum.plmna/sites=www.someother.com
> >> it is not working, because this URL path ends with .com.
> >>
> >> Is there any workaround for this?
> >>
> >> Hope my question is clearer now...
> >>
> >>
> >> --- On Sat, 3/28/09, Marcus Kool <marcus.kool@xxxxxxxxxxxxxxx> wrote:
> >>> From: Marcus Kool <marcus.kool@xxxxxxxxxxxxxxx>
> >>> Subject: Re: .com extension blocking causing blocking of redirecting URL's
> >>> To: "Truth Seeker" <truth_seeker_3535@xxxxxxxxx>
> >>> Cc: "Squid maillist" <squid-users@xxxxxxxxxxxxxxx>
> >>> Date: Saturday, March 28, 2009, 1:53 PM
> >>> The ACL blocks URLs that end with .com,
> >>> i.e. it blocks a URL such as www.example.com while it does
> >>> not block www.example.com/index.html
> >>>
> >>> If you change the patterns to include a slash you are fine.
> >>> The slash prevents bare .com domains from being matched, e.g.
> >>> .*\.com$   becomes   .*\..*/.*\.com$
> >>>
> >>> Marcus
> >>>
> >>> Truth Seeker wrote:
> >>>> Hi Techies,
> >>>>
> >>>> I have an acl which blocks downloads of files with
> >>>> harmful extensions, like .exe, .bat, .com, etc. This rule
> >>>> is working fine. The following are the details of it:
> >>>>
> >>>> ### Blocking of dangerous extensions for certain groups
> >>>> acl dangerous_extension urlpath_regex -i "/etc/squid/include-files/dangerous_extension.squid"
> >>>> http_access allow vip_acl dangerous_extension
> >>>> http_access allow power_acl dangerous_extension
> >>>> http_access allow ultimate_acl dangerous_extension
> >>>> http_access allow download_surfers_acl dangerous_extension
> >>>> http_access deny dangerous_extension
> >>>> deny_info ERR_DANGEROUS_ESTENSIONS dangerous_extension
> >>>>
> >>>> # cat /etc/squid/include-files/dangerous_extension.squid
> >>>> .*\.exe$
> >>>> .*\.com$
> >>>> .*\.vb$
> >>>> .*\.vbs$
> >>>> .*\.vbe$
> >>>> .*\.cmd$
> >>>> .*\.bat$
> >>>> .*\.ws$
> >>>> .*\.wsf$
> >>>> .*\.scr$
> >>>> .*\.shs$
> >>>> .*\.pif$
> >>>> .*\.hta$
> >>>> .*\.jar$
> >>>> .*\.js$
> >>>> .*\.jse$
> >>>> .*\.lnk$
> >>>> .*\.mov$
> >>>> .*\.3gp$
> >>>> .*\.avi$
> >>>> .*\.rar$
> >>>> .*\.zip$
> >>>>
> >>>>
> >>>>
> >>>> If there is a site which redirects traffic to another
> >>>> .com site, it will trigger the above rule, which will
> >>>> result in failure of a legitimate request. How can I
> >>>> work around this issue?
> >>>
> >>>> Thanks in Advance...
> >>>>
> >>>>
> >>>
> >>>>
> >
> >
> 
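One possible workaround for the situation described in the thread above — an editor's sketch, not something proposed by any of the posters: treat `.com` as a dangerous extension only when it ends the final path segment and that segment contains no `=`, which is what distinguishes the plain download paths from the domainsponsor-style redirect path. The `.com` line in `dangerous_extension.squid` would then read `/[^/=]+\.com$` instead of `.*\.com$`. Checked with Python's `re` module:

```python
import re

# Hypothetical replacement for the .com entry of the ACL file:
# require a '/' immediately before a filename-like segment
# (no '/' or '=' characters) that ends in .com.
candidate = re.compile(r'/[^/=]+\.com$', re.IGNORECASE)

assert candidate.search('/setup.com')        # plain .com file: blocked
assert candidate.search('/files/setup.com')  # nested .com file: blocked
assert not candidate.search('/oplatum.plmna/sites=www.someother.com')  # redirect: allowed
```

This is only a heuristic: a genuine .com file served through a URL whose last path segment contains `=` would slip through, so it trades the known false positive for a possible false negative.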


      


