Re: url_rewrite_timeout


 



Hi


Of course my browser uses my proxy.


I'm trying to request www.perdu.com (a simple website that only uses HTTP).


Indeed, when I look at access.log I see the following trace:


1533023517.502   4990 10.1.0.39 NONE_ABORTED/000 0 GET http://perdu.com/ - HIER_NONE/- -

So it seems the connection was aborted, but that shouldn't stop my rewriter, should it?

And if my rewriter replies OK after the 10-second timeout, the page is displayed fine in my browser. I was expecting to receive a 500 error.
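For what it's worth, the blocking rewriter I use for the test looks roughly like this (a minimal sketch, assuming the non-concurrent squid-3.4+ helper protocol: one request per line on stdin, a single result-code reply per line on stdout; the 15-second delay is only there to outlast url_rewrite_timeout):

```python
#!/usr/bin/env python3
"""Deliberately blocking URL rewriter, for testing url_rewrite_timeout."""
import sys
import time

def make_reply(request_line, delay=0):
    """Sleep `delay` seconds, then tell Squid to leave the URL unchanged.

    Squid sends "<URL> <client/fqdn> <ident> <method> ..." per line; the
    squid-3.4+ result code "OK" with no kv-pairs means "no rewrite".
    Sleeping here simulates a stuck rewriter.
    """
    time.sleep(delay)
    return "OK"

if __name__ == "__main__":
    for line in sys.stdin:
        line = line.strip()
        if line:
            # Block past the configured 10-second url_rewrite_timeout.
            sys.stdout.write(make_reply(line, delay=15) + "\n")
            sys.stdout.flush()
```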




On 31/07/2018 09:43, Amos Jeffries wrote:
On 31/07/18 19:35, ygirardin wrote:
Hi,


I'm trying to use the new squid4 directive url_rewrite_timeout.


To make sure it works, my rewriter deliberately blocks so that the
timeout is triggered.

But nothing happens. I thought my browser would receive a 500 error
with the following configuration, but nothing happened and there is no
trace in cache.log.


Here is my configuration:


url_rewrite_timeout 10 seconds on_timeout=fail
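(For context, the full helper wiring around that directive looks something like this; the helper path and children count here are illustrative, not my exact setup:)

```
url_rewrite_program /usr/local/bin/blocking_rewriter.py
url_rewrite_children 5
url_rewrite_timeout 10 seconds on_timeout=fail
```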


What am I doing wrong?

What URL are you requesting?

What shows up in access.log?

Does the browser actually use the proxy?

Perhaps using something less likely to use non-HTTP protocols for the
fetch will show things better. Try squidclient to do the fetch.

Amos
_______________________________________________
squid-users mailing list
squid-users@xxxxxxxxxxxxxxxxxxxxx
http://lists.squid-cache.org/listinfo/squid-users




