
RE: any work arounds for bug 2176


 



Reposted for info to the list, without the attachments that cause the list to bounce the message

-----Original Message-----
From: Bill Allison 
Sent: 18 December 2009 09:43
To: 'Amos Jeffries'; Brett Lymn
Cc: squid-users@xxxxxxxxxxxxxxx
Subject: RE:  any work arounds for bug 2176

"I  get the same error as Brett only when the body of the post is much greater than that which causes the post to fail."

Correction after further testing...

I get the same error as Brett only when the body of the post is much greater than the size that causes the post to fail, and even then only sometimes, in repeated tests with the same file being uploaded.

Other times the browser reports "The connection was reset" and tcpdump shows that the proxy sent a FIN to the server and then to the client in response to the second 401 from the server. The server closes the connection, but the client continues sending the POST, and the proxy then sends the client a string of RSTs.

For info "Invalid Verb" is issued by http.sys in IIS 6.0, in response to receiving a header that is not strictly rfc-compliant (including truncated).

Attached as requested is my squid.conf and tcpdumps of the Invalid Verb and RST failure cases.

Unlike Brett I'm very much a novice C coder but I'm perfectly happy to patch, compile and test if it helps generate a solution.

Regards
Bill A.

-----Original Message-----
From: Amos Jeffries [mailto:squid3@xxxxxxxxxxxxx]
Sent: 17 December 2009 09:10
To: Brett Lymn
Cc: Bill Allison; squid-users@xxxxxxxxxxxxxxx
Subject: Re:  any work arounds for bug 2176

Brett Lymn wrote:
> On Wed, Dec 16, 2009 at 07:57:21AM -0600, Bill Allison wrote:
>> Sorry - that was misleading. I've had 
>> persistent_connection_after_error set on throughout my testing.
> 
> I don't have that in my config file at all so I would guess it is at 
> the default.
> 

Which is off. Now I'm confused.
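For reference, the directive under discussion looks like this in squid.conf (an illustrative excerpt; when the line is absent, as in Brett's config, the default is off):

```
# squid.conf excerpt (illustrative): Bill has been testing with this
# enabled throughout; Brett's config omits it, so he runs with the
# default (off) -- yet both see failures, hence the confusion above.
persistent_connection_after_error on
```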

>> I get the same error as Brett only when the body of the post is much greater than that which causes the post to fail.
>>
> 
> I only tried a large-ish document.  We did observe the same strange 
> limit that Bill has seen when we tested without the patch applied, 
> under a certain "magic" threshold the document would upload - the 
> threshold seemed to be around the 50k mark, over that threshold we 
> would just get popups.
> 
>> I'd like to correlate network traces with debug output and would 
>> appreciate suggestions as to which debug_options would include all 
>> possibly relevant info
>>
> 
> I am a C coder and may have some time to do some debugging on this 
> between christmas and new year so, Amos, if you have any thoughts or 
> hints as to where to go looking I can certainly have a stab at it.
> 

Thank you. Any help at all would be great.

I *think* the relevant code is in src/client_side_reply.cc, but what to look for is where I'm currently stuck. The keep_alive values resolved things for you, Brett, but not for Bill.


The variable nature of the threshold suggests a race between the actions that trigger the bug and the rate at which Squid is reading the request in.

AFAIK popups only occur when the client is sent two re-auth challenges. In the un-patched Squid that was caused by the first, half-authenticated connection being closed by Squid before auth could complete; the second connection, when challenged for more auth, would then cause the popup.

I think the next step is to find out what the difference between your two setups is exactly:
  * squid.conf
  * headers between Squid and the POSTing app.
  * headers between Squid and the web server.

Particularly in what reply headers are going back. That should give us a little more of an idea of which areas to look at.
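On Bill's earlier question about which debug_options to use: a plausible starting point (my guess at useful sections, not a definitive list) would be something like:

```
# squid.conf excerpt (a sketch): raise verbosity only for the sections
# likely relevant here -- 11 covers HTTP traffic, 33 the client-side
# routines (src/client_side*.cc) mentioned above; everything else at 1.
debug_options ALL,1 11,3 33,3
```

Higher levels (e.g. 11,5) log more per-request detail in cache.log, at the cost of volume; that should be enough to correlate against the tcpdump traces.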

If, as you say, the patch solved the issue but you saw the same thing earlier, then I suspect it's probably a squid.conf detail being overlooked.

Amos
--
Please be using
   Current Stable Squid 2.7.STABLE7 or 3.0.STABLE20
   Current Beta Squid 3.1.0.15



