
Re: redirect_program and non-redirection


 



> 
> Tuc at T-B-O-H.NET wrote:
> >> Tuc at T-B-O-H.NET wrote:
> >>> Hi,
> >>>
> >>> 	I'm having an issue and I'm not sure why. Unfortunately I'm
> >>> not at the site to see the problem, so debugging is a bit difficult.
> >>>
> >>> 	I have :
> >>>
> >>> redirect_program /usr/local/bin/squidintercept.pl
> >>>
> >>> 	And the program (as mentioned before) is fairly generic.
> >>> If it's a "GET", if the URL ends in "/", and if they aren't in a
> >>> db, send a 302 to a webpage on my webserver. 
> >>>
> >>> 	I'm getting the GET match, I'm getting the "/" match, and
> >>> I'm getting the 302... But it seems like the browser just ignores it
> >>> and goes on its merry way...
> >>>
> >>> 	The hit that triggers it is :
> >>>
> >>> 192.168.3.3 - - [09/May/2008:07:48:01 -0400] "GET http://www.brockport.k12.ny.us/ HTTP/1.1" 302 191 "http://search.live.com/results.aspx?srch=105&FORM=IE7RE&q=brockport+central" "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; .NET CLR 1.0.3705; .NET CLR 1.1.4322; Media Center PC 4.0)" TCP_MISS:NONE
> >>>
> >>> 	In which you see the 302, but then :
> >>>
> >>> 192.168.3.3 - - [09/May/2008:07:48:02 -0400] "GET http://dss1.siteadvisor.com/DSS/Query? HTTP/1.1" 200 1684 "-" "SiteAdvisor" TCP_MISS:DIRECT
> >>> 192.168.3.3 - - [09/May/2008:07:48:02 -0400] "GET http://www.brockport.k12.ny.us/pix/home/topLogo.gif HTTP/1.1" 304 427 "http://www.brockport.k12.ny.us/" "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; .NET CLR 1.0.3705; .NET CLR 1.1.4322; Media Center PC 4.0)" TCP_IMS_HIT:NONE
> >>> 192.168.3.3 - - [09/May/2008:07:48:02 -0400] "GET http://www.brockport.k12.ny.us/pix/home/topSpacer.gif HTTP/1.1" 304 427 "http://www.brockport.k12.ny.us/" "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; .NET CLR 1.0.3705; .NET CLR 1.1.4322; Media Center PC 4.0)" TCP_IMS_HIT:NONE
> >>> 192.168.3.3 - - [09/May/2008:07:48:02 -0400] "GET http://www.brockport.k12.ny.us/pix/home/intranetLink.gif HTTP/1.1" 304 427 "http://www.brockport.k12.ny.us/" "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; .NET CLR 1.0.3705; .NET CLR 1.1.4322; Media Center PC 4.0)" TCP_IMS_HIT:NONE
> >>>
> >>> 	As if the 302 is totally ignored. At least before, when they
> >>> were matching, I saw :
> >>>
> >>> 192.168.3.3 - - [07/May/2008:18:01:05 -0400] "GET http://www.example.com/guest/request.html HTTP/1.1" 200 4055 "http://search.live.com/results.aspx?srch=105&FORM=IE7RE&q=brockport+central" "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; .NET CLR 1.0.3705; .NET CLR 1.1.4322; Media Center PC 4.0)" TCP_REFRESH_HIT:DIRECT
> >>> 192.168.3.3 - - [07/May/2008:18:01:06 -0400] "GET http://www.example.com/HOME.png HTTP/1.1" 200 1245 "http://www.example.com/guest/request.html" "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; .NET CLR 1.0.3705; .NET CLR 1.1.4322; Media Center PC 4.0)" TCP_MISS:DIRECT
> >>> 192.168.3.3 - - [07/May/2008:18:01:06 -0400] "GET http://www.example.com/spacer.gif HTTP/1.1" 200 1347 "http://www.example.com/guest/request.html" "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; .NET CLR 1.0.3705; .NET CLR 1.1.4322; Media Center PC 4.0)" TCP_MISS:DIRECT
> >>>
> >>> 	It seems, though, that after that the 302 wasn't abided
> >>> by.
> >>>
> >>> 	Places to look?
> >>>
> >>> 			Thanks, Tuc	
> >> Start with squid -v
> >>
> >> We need to know what version you are talking about in order to provide 
> >> good help.
> >>
> > 	Very sorry. 
> > 
> > Squid Cache: Version 2.6.STABLE20+ICAP
> > configure options: '--bindir=/usr/local/sbin' '--sbindir=/usr/local/sbin'
> > '--datadir=/usr/local/etc/squid' '--libexecdir=/usr/local/libexec/squid'
> > '--localstatedir=/usr/local/squid' '--sysconfdir=/usr/local/etc/squid'
> > '--enable-removal-policies=lru heap' '--disable-linux-netfilter'
> > '--disable-linux-tproxy' '--disable-epoll' '--enable-auth=basic ntlm digest'
> > '--enable-basic-auth-helpers=DB NCSA PAM MSNT SMB YP'
> > '--enable-digest-auth-helpers=password'
> > '--enable-external-acl-helpers=ip_user session unix_group wbinfo_group'
> > '--enable-ntlm-auth-helpers=SMB'
> > '--enable-negotiate-auth-helpers=squid_kerb_auth' '--with-pthreads'
> > '--enable-storeio=ufs diskd null aufs coss' '--enable-delay-pools'
> > '--enable-snmp' '--enable-ssl' '--with-openssl=/usr' '--enable-icmp'
> > '--enable-htcp' '--enable-forw-via-db' '--enable-cache-digests'
> > '--enable-wccpv2' '--enable-referer-log' '--enable-useragent-log'
> > '--enable-arp-acl' '--enable-pf-transparent' '--enable-ipf-transparent'
> > '--enable-follow-x-forwarded-for' '--enable-icap-support'
> > '--with-large-files' '--enable-large-cache-files' '--enable-stacktraces'
> > '--enable-err-languages=Armenian Azerbaijani Bulgarian Catalan Czech Danish
> > Dutch English Estonian Finnish French German Greek Hebrew Hungarian Italian
> > Japanese Korean Lithuanian Polish Portuguese Romanian Russian-1251
> > Russian-koi8-r Serbian Simplify_Chinese Slovak Spanish Swedish
> > Traditional_Chinese Turkish Ukrainian-1251 Ukrainian-koi8-u Ukrainian-utf8'
> > '--enable-default-err-language=English' '--prefix=/usr/local'
> > '--mandir=/usr/local/man' '--infodir=/usr/local/info/'
> > 'i386-portbld-freebsd7.0' 'build_alias=i386-portbld-freebsd7.0'
> > 'host_alias=i386-portbld-freebsd7.0' 'target_alias=i386-portbld-freebsd7.0'
> > 'CC=cc' 'CFLAGS=-O2 -fno-strict-aliasing -pipe -I/usr/include -g'
> > 'LDFLAGS= -rpath=/usr/lib:/usr/local/lib -L/usr/lib' 'CPPFLAGS='
> > 
> >> Second is looking at your redirector. Is it sending a 302 and an
> >> unchanged URL out?
> >>
> > 	In the perl program it sends :
> > 
> > print "302:http://www.example.com/guest/request.html\n";
> > 
> > 	when I want them to be redirected :
> > 
> >   @X = split;
> >   $url = $X[0];
> >   print "$url\n";
> > 
> > 	If not.
> >> Then the redirect_access?
> >>
> > 	Eh? Whats dat?
> 
> Small typo on my part. But that name is now obsolete. It's a 
> url_rewrite_* control.
> http://www.squid-cache.org/Versions/v2/2.6/cfgman/url_rewrite_access.html
> 
> You could use that coupled with a urlpath_regex ACL to get around your 
> troublesome re-writer logic and only pass the URI you want to re-write 
> to the re-writer.
>
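(For reference, a minimal squid.conf sketch of that suggestion using the 2.6
url_rewrite_* directives; the ACL names and the regex here are only
illustrative, not taken from the actual configuration:)

url_rewrite_program /usr/local/bin/squidintercept.pl
# Only GET requests whose URL path ends in "/" are handed to the rewriter;
# everything else bypasses it entirely.
acl GETs method GET
acl trailing_slash urlpath_regex /$
url_rewrite_access allow GETs trailing_slash
url_rewrite_access deny all
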
	I have to use a regex either way. I'd rather it be in a Perl
program where I can test more. I don't mind passing everything to the
rewriter. If it's not a GET, or doesn't match /\/$/, it gets passed along.
It's troublesome only in that I've not found the perfect regex I want
to use. If I do /[com|edu|net|us|org]\// then it means they have to
visit the MAIN page of the site. I still want to capture them if they
go to http://www.example.com/some/subdirectory/. The problem is that
I've run into CGIs that are using "/" to pass field information
along. (Admittedly, I do the same myself for CGIs I've written, but
I NEVER end with a "/" hanging out in the breeze.)
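
(A minimal sketch of the kind of rewriter being described, pieced together
from the fragments in this thread; the "time|url" record keyed by client IP,
the file paths, and the redirect target are assumptions, not the poster's
actual squidintercept.pl:)

#!/usr/local/bin/perl
use strict;
use DB_File;
use Fcntl;

$| = 1;    # unbuffered output, or squid hangs waiting for the helper's replies

# Hash of client IP => "time|original_url", shared with the dump/delete scripts.
tie my %seen, 'DB_File', "/tmp/squidintercept", O_RDWR|O_CREAT, 0666, $DB_HASH
    or die "tie: $!";

while (my $line = <STDIN>) {
    chomp $line;
    # Squid 2.6 hands the helper lines of: URL client_ip/fqdn ident method
    my ($url, $client, $ident, $method) = split ' ', $line;
    my ($ip) = split m{/}, $client;

    # Redirect GETs for URLs ending in "/" from clients not yet in the db.
    if ($method eq 'GET' && $url =~ m{/$} && !exists $seen{$ip}) {
        $seen{$ip} = time() . "|$url";    # remember where they wanted to go
        (tied %seen)->sync();
        print "302:http://www.example.com/guest/request.html\n";
    } else {
        print "$url\n";                   # pass the request through unchanged
    }
}
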
>
> Then again. With the same ACL you could do a fancy deny_info redirection 
> instead of re-writing anything:
> 
>   acl known_ips src "./database_of_non-redirected_ips"
>   acl redirect urlpath_regex ^/$
>   deny_info http://somewhere.example.com redirect
>   http_access deny !known_ips redirect
>
	But I don't want to redirect 100% of the time. I want to
redirect when *I* decide that it should redirect. The way I do
this is with a DB_File that keeps :

valhalla# /tmp/dumpsquidintercept.pl 
192.168.3.3 = 1210335746|http://www.home.nyu.edu/

	Where the IP WANTED to go, and when. Then, depending on how
often I want to have the message show up again, I do :

valhalla# cat /tmp/deletesquidintercept.pl 
#!/usr/local/bin/perl

use DB_File;
use Fcntl;

$x=tie (%fdb,'DB_File',"/tmp/squidintercept",O_RDWR|O_CREAT,0777,$DB_HASH) ||die $!;

delete $fdb{'192.168.3.3'};
$x->sync();
untie %fdb; 
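
(For completeness, the dumpsquidintercept.pl whose output is shown above is
presumably something along these lines; this is a guess based on the delete
script, not the actual file:)

#!/usr/local/bin/perl
use DB_File;
use Fcntl;

# Open the same DB_File read-only and print each client IP along with the
# "time|url" record the rewriter stored for it.
tie my %fdb, 'DB_File', "/tmp/squidintercept", O_RDONLY, 0666, $DB_HASH
    or die $!;

print "$_ = $fdb{$_}\n" for sort keys %fdb;

untie %fdb;
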
> 
> >> Or should you really be using url_rewrite directives?
> >>
> > 	I was going by :
> > 
> > http://wiki.squid-cache.org/SquidFaq/SquidRedirectors
> > 	and
> > http://wiki.squid-cache.org/ConfigExamples/PhpRedirectors
> > 
> > 	I didn't see either mentioned. 
> 
> Ah, FAQ needed updating. Thank you.
> 
> > 
> > 	My program "randomly" invokes the :
> > 
> > print "302:http://www.example.com/guest/request.html\n";
> > 
> > 	line until I set a flag in the filesystem to stop it.
> > 
> > 	I've found that if the URL that matches my conditions
> > is already in the cache, it seems to ignore the 302. If it's a
> > new site that has never seen the light of the cache, it works.
> 
> Definitely a bug then.
> May be related to #7
>   http://www.squid-cache.org/bugs/show_bug.cgi?id=7
>
	Dunno. Above my head.
> 
> Or it could be a different one with the store looking for its key 
> information in the wrong place.
> 
	Any way to fix it? I can't be guaranteed that the user will
request fresh content quickly enough for my needs.


		Thanks, Tuc
