
Re: Strange problem accessing http://Bloomberg.com

> Hello,
>
> I am having a very bizarre problem and I am wondering if anyone here can
> shed some light on it.
> Our internal users are accessing the Internet via a squid v2.6-STABLE9
> proxy using a proxy.pac file.
> Their browsers (corporate dictates Internet Explorer) are configured to
> "Automatically detect proxy settings".
>
> When I open the page http://bloomberg.com using the above settings, the
> page mostly loads but the browser locks up and needs to be killed.
>
> If I configure the browser to use a statically configured proxy and
> port, then the page loads fine.
>
> The
>

<elided long trace for brevity>

>
> At this point, the page has loaded and is working fine.
> Note the TCP_DENIED entry four lines from the bottom of the second test.
> This looks like a bad URL caused by some sloppy copy-pasting of code by the
> webmasters of Bloomberg.
> As you can see, the additional lines in the session that works are
> nothing special (except for that DENIED entry).
>
> Any idea as to what could be going on here?
> My gut tells me that the "fix" lies in the IE configuration, but I also
> think there should be some kind of work-around possible in Squid.

I'm inclined to suspect that something in the .PAC file is breaking under
that fubar URL.

Being JavaScript, it's susceptible to URLs with quote characters (' and ")
in their strings if the browser is broken enough to pass them unencoded
(' in particular seems not to get encoded easily).
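
For illustration, a typical proxy.pac boils down to a single FindProxyForURL()
function along the lines of the sketch below. Your real file wasn't posted, so
the hostnames, port and rules here are only placeholders:

  // Minimal proxy.pac sketch -- hostnames and port are made up.
  function FindProxyForURL(url, host) {
      // Internal hosts bypass the proxy; everything else goes through it.
      if (shExpMatch(host, "*.internal.example.com")) {
          return "DIRECT";
      }
      // Any extra string handling of 'url' here (logging, regex matching,
      // splitting on delimiters) is where an unencoded ' or " could trip
      // things up.
      return "PROXY proxy.example.com:3128";
  }

If your real file does anything fancier with the url argument, that would be
the first place I'd look.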

I've also seen Squid helpers that barf on mismatched quoting in similar ways
(usually via SQL-injection holes).
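
As an illustration only (not any particular helper, just the shape of the
defence), a helper that treats each incoming line as opaque data avoids that
class of breakage. This sketch assumes a helper written for Node and fed the
URL through external_acl_type:

  #!/usr/bin/env node
  // Hypothetical Squid external ACL helper: read one URL per line on stdin
  // and answer "OK" or "ERR" per line, never re-quoting the raw URL.
  const readline = require("readline");
  const rl = readline.createInterface({ input: process.stdin });

  rl.on("line", (line) => {
      const url = line.trim();
      if (/['"]/.test(url)) {
          // Deny and log rather than letting stray quotes corrupt later
          // parsing (e.g. a hand-built SQL string).
          process.stderr.write("suspicious quoting in URL: " + url + "\n");
          process.stdout.write("ERR\n");
      } else {
          process.stdout.write("OK\n");
      }
  });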

Amos


