
Re: problem with virtually simple HTML page proxying

>> Zoran Milenkovic wrote:
>>> Hi!
>>>
>>> I have Squid Cache: Version 2.6.STABLE13 installed on Fedora Core 6
>>> (Linux 2.6.18-1.2798.fc6). Squid is configured in the usual manner:
>>> internal hosts are only allowed to reach the Internet through the
>>> proxy server, and the rest of the squid configuration is left at its
>>> defaults. Everything works perfectly, except for one thing: users
>>> cannot reach one specific HTML page:
>>> http://www.nbs.yu/internet/cirilica/scripts/bankeMenjaci/index.html
>>>
>>> In the squid access.log nothing seems to be wrong:
>>> 1201958794.061    300 192.168.0.24 TCP_MISS/200 314 GET
>>> http://www.nbs.yu/internet/cirilica/scripts/bankeMenjaci/index.html -
>>> DIRECT/194.79.41.40 text/html
>>>
>>> ...but instead of the page, users get an empty page from the proxy
>>> server! Without the proxy, the page displays correctly.
>>>
>>> I would like to add that this happens only with that particular page
>>> on this site (www.nbs.yu); everything else works fine. Also, I was
>>> able to reproduce this error with other squid servers.
>>>
>>> On the other hand, I would like my users to reach web content through
>>> squid without any exceptions, if possible. So I would like to ask: is
>>> this a problem with squid, with this particular web page, or with
>>> something else? And how could I solve it?
>>
>> I have tested this here (through a squid-3) and seen this behaviour on
>> that page.
>> From the headers it looks like there are at least two servers providing
>> that site. When that page is retrieved directly from Apache it returns a
>> custom 404 page etc, etc.
>>
>> There is also something identifying itself as an "OpenCMS/5.0.0" server
>> which returns a set of 404 headers without a page body.
>>
>> It is likely that the second result got into your cache somehow.
>> You can temporarily clear it with:
>> squidclient -m PURGE http://www.nbs.yu/internet/cirilica/scripts/bankeMenjaci/index.html
>>
>> But there is nothing you can do long-term as a visitor until the admin
>> of
>> nbs.yu fixes their CMS.
>>
>> Amos
>> --
>> Please use Squid 2.6STABLE17+ or 3.0STABLE1+
>> There are serious security advisories out on all earlier releases.
>
> Dear Amos,
>
> Thanks for the quick reply!
>
> I will try to contact the web admin of the nbs.yu website and describe
> the problem.
>
> OT: Could you please point me to literature about the tools and/or
> methods you used to find out what is going on with this
> website/problem? Of course, only if something like that exists. Thanks
> again!

I loaded the page in Firefox to see if it was showing. I got the empty page!

Then I used squidclient to view the full HTTP transactions in progress.

# To see what squid was giving out:
squidclient http://www.nbs.yu/......

  -- this served the empty page from my cache, so I purged the locally
cached copy and tried again:

squidclient -m PURGE http://www.nbs.yu/......
squidclient http://www.nbs.yu/......

  -- and got a proper page from Apache.
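
Note that squid refuses the PURGE method unless the configuration
explicitly allows it. A commonly used squid.conf fragment to permit
purging from the proxy box itself looks roughly like this (the acl
names are illustrative, not mandatory):

```
acl purge_method method PURGE
acl localhost src 127.0.0.1/32
http_access allow purge_method localhost
http_access deny purge_method
```

With that in place, squidclient -m PURGE <URL> run on the proxy host
evicts the cached object so the next request goes back to the origin.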


# To see what the site itself was serving (bypassing the proxy):
squidclient -h www.nbs.yu -p 80 http://www.nbs.yu/......

When it worked the first time (the response headers named Apache), I
tried again to be sure and got the empty page (headers naming OpenCms).
Repeating another four times always brought up the empty page from
OpenCms.
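
That alternation between the two backends can be sampled with a small
shell loop. This is only a sketch: the server_of helper is mine, the
URL is the shortened placeholder from above (substitute the full page
address), and the run is guarded so it is a no-op where squidclient is
not installed.

```shell
#!/bin/sh
# Pull the Server: response header out of a squidclient-style
# transcript, stripping carriage returns and the header name.
server_of() {
    tr -d '\r' | grep -i '^Server:' | head -n 1 | cut -d' ' -f2-
}

# Sample the origin directly several times, bypassing any cache.
if command -v squidclient >/dev/null 2>&1; then
    for i in 1 2 3 4 5; do
        squidclient -h www.nbs.yu -p 80 http://www.nbs.yu/... | server_of
    done
fi
```

If two different server strings show up across the runs, requests are
being spread over mismatched backends, which is exactly what a cache
then faithfully preserves.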


The output of squidclient is:
  Request headers the client sends to the server/squid
  <empty line>
  Response headers from the server/squid to the client
  <empty line>
  page data (if any)
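
That three-part layout makes a transcript easy to split mechanically.
A minimal sketch, using an invented transcript standing in for real
squidclient output:

```shell
#!/bin/sh
# An invented squidclient-style transcript: request headers,
# blank line, response headers, blank line, body.
transcript='GET /index.html HTTP/1.0
Host: www.example.org

HTTP/1.0 200 OK
Server: Apache

<html>hello</html>'

# Blank lines separate the sections: section 0 holds the request
# headers, section 1 the response headers, section 2+ the body.
body=$(printf '%s\n' "$transcript" | awk '/^$/ {sec++; next} sec >= 2')
echo "$body"
```

Running this prints just the body line, <html>hello</html>; swapping
the condition to sec == 1 would print the response headers instead.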


Amos


