
[Fwd: Re: Ntlm and url_regex]

Michael Alger wrote:
> On Mon, Oct 22, 2007 at 11:44:17AM +0200, Alexandre Mackow wrote:
>> Squid is running and works perfectly with authentication
>> based on AD (NTLM).
>> For my users who are not fully authorized, I created an ACL:
>> "acl sites_ok url_regex "/etc/squid/sitesok.list"
>> http_access allow sites_ok"
>>
>> with 3 sites allowed for everybody.
>> The problem is that when a user who is not authorized via NTLM goes
>> to a page allowed by url_regex, and a link is present on the
>> page (I think), an authentication window opens and the user
>> has to click through the prompt.
> 
> When a browser accesses a site, it will download all resources
> required to display it. The main ones to look for are style sheets,
> scripts, and embedded images and other types of media. You might
> find the "Firebug" extension for Firefox is useful for identifying
> all the things your browser is accessing in order to render a page.
> 
> You will need to permit unauthenticated access to every resource on
> the page(s) you want to allow access to in order for a user to be
> able to browse it without being prompted to authenticate.
> 
> Note that it's perfectly legitimate for some of the resources used
> by a page to be located on a different server, and even a completely
> unrelated domain. A good example is advertising scripts, which
> typically live on an ad network's servers (e.g. doubleclick.net).
> 
> It's also possible that the browser is "pre-fetching" pages linked
> to by the site, by following normal hyperlinks. Most browsers don't
> do this "out of the box" though, only with the help of "internet
> accelerator" type software. So while this is possible, the most
> likely cause of the authentication popup is that the sites you're
> allowing access to include references to media or scripts located on
> other servers which you aren't allowing access to.
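
A minimal squid.conf sketch of this approach, extending the ACL from the original post (the CDN domain here is a hypothetical placeholder for whatever external hosts the allowed pages actually reference):

```
# Pages everyone may browse, listed as regexes in the file below
acl sites_ok url_regex "/etc/squid/sitesok.list"

# Hypothetical external hosts serving CSS/JS/images for those pages;
# replace with the domains found by inspecting the pages (e.g. with Firebug)
acl sites_ok_deps dstdomain .static-cdn.example.com .doubleclick.net

http_access allow sites_ok
http_access allow sites_ok_deps
```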
> 
> AFAIK, there's no way in squid to tell it to allow a site and
> "everything on it". If working out what external resources the site
> requires and permitting access to them is not an option (e.g. it's
> outside of your control or changes frequently), you might be able to
> use the "Referer" header from the client's request in an ACL -- but
> doing so makes it possible for anyone clever enough to access any
> site without authenticating (the client can send whatever Referer
> header it wants), which may be unacceptable.
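
Squid supports this via the referer_regex ACL type. A sketch, assuming www.example.com stands in for one of the allowed sites:

```
# Allow any request whose Referer points at an allowed page.
# NOTE: the Referer header is client-controlled and trivially spoofed,
# so this weakens authentication as described above.
acl from_sites_ok referer_regex -i ^http://www\.example\.com/
http_access allow from_sites_ok
```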
> 
> A completely different option could be to use a tool to create a
> local "mirror" of the site(s) you want to allow access to. Such a
> tool would pull in all resources required to render the page and
> store them on a local server. It would also rewrite the original
> page to reference the local copies. Then you just need to permit
> unauthenticated users access to your local mirror.
> 
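The mirroring approach above can be sketched with wget, which fetches a page together with its prerequisites (stylesheets, scripts, images) and rewrites links to point at the local copies. The URL and destination path are placeholders:

```shell
# Mirror one allowed site into a local web root (paths are examples).
# --page-requisites pulls in CSS/JS/images needed to render the pages;
# --convert-links rewrites references to use the local copies;
# --no-parent keeps the crawl inside the starting directory.
wget --mirror --page-requisites --convert-links --no-parent \
     --directory-prefix=/var/www/mirror \
     http://www.example.com/
```

The local mirror can then be served by any web server, and unauthenticated users given access to that server alone.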

OK, thanks for your help.

Regards.



begin:vcard
fn:Alexandre Mackow
n:Mackow;Alexandre
org:Groupe Millet;OSI
adr;dom:;;Bretignolles;Bressuire;;79300
email;internet:amackow@xxxxxxxxxxxxxxxxx
title:Service OSI
tel;work:05 49 74 55 67
x-mozilla-html:FALSE
version:2.1
end:vcard

