
Re: squid refresh_pattern - different url with same XYZ package

Hi Eliezer

Thanks, but I have another question: do we need to know the mirror addresses for sf.net, or for example filehippo.com?

Or is that not important?

And can you explain the nginx variables used in site.conf, for example $host, $uri, and $request_uri? I used the same method with Apache and IntelligentMirror three years ago.


thanks,

Eliezer Croitoru <eliezer@xxxxxxxxxxxx> wrote on Tue, 03 Apr 2012 11:10:10 +0300:
On 03/04/2012 09:37, Amos Jeffries wrote:
On 3/04/2012 5:57 a.m., Mohsen Saeedi wrote:
Hi

I have a problem with Squid's refresh_pattern. I used a regex in
refresh_pattern so that every exe file, for example, gets cached and
clients can then download it at a high rate. But when someone downloads
from certain websites (for example Mozilla or FileHippo), they are
redirected to a different URL for the same XYZ exe file. For example,
firefox-version.exe is cached to disk, but when another client sends a
new request, it is automatically redirected to a different URL for
downloading the same Firefox file. How can I configure Squid for this
situation?


By altering your regex pattern to work with both URLs, or by adding a
different pattern to match the alternative URL.
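As a minimal sketch of what this could look like in squid.conf: the hostnames and timings below are hypothetical (not taken from this thread), and note that refresh_pattern only controls how long a cacheable response is kept fresh, not which URLs map to the same object.

```
# Hypothetical patterns: one per mirror hostname, keeping matching
# .exe responses fresh for up to a week (10080 minutes).
refresh_pattern -i ^http://download\.mozilla\.org/.*\.exe$   10080 90% 10080
refresh_pattern -i ^http://fs[0-9]+\.filehippo\.com/.*\.exe$ 10080 90% 10080
```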
If you have some example patterns, that could help simplify the problem.
I think I do understand, and if so, I just recently implemented a cache for SourceForge using nginx.
As for FileHippo, it's different.
Let's say I am downloading one Total Commander file; the links from a couple of servers will be:

http://fs31.filehippo.com/7077/9965e6338ead4f6fb9d81ac695eae99a/tc80beta24.exe

http://fs30.filehippo.com/6386/9965e6338ead4f6fb9d81ac695eae99a/tc80beta24.exe

http://fs33.filehippo.com/6957/9965e6338ead4f6fb9d81ac695eae99a/tc80beta24.exe

So there is a basic URL match pattern, but you must break up the path, and it's a bit complicated.
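To illustrate the "basic URL match pattern" on the three FileHippo links above: the server number (fs30/fs31/fs33) and the numeric path segment vary per mirror, but the hash and filename stay the same, so a regex can collapse all three to one cache key. A small sketch in Python (the pattern itself is a hypothetical illustration, not from the thread):

```python
import re

# The three FileHippo mirror URLs quoted above.
urls = [
    "http://fs31.filehippo.com/7077/9965e6338ead4f6fb9d81ac695eae99a/tc80beta24.exe",
    "http://fs30.filehippo.com/6386/9965e6338ead4f6fb9d81ac695eae99a/tc80beta24.exe",
    "http://fs33.filehippo.com/6957/9965e6338ead4f6fb9d81ac695eae99a/tc80beta24.exe",
]

# Hypothetical pattern: skip the varying server and path number, and
# capture the stable (hash, filename) pair as a cache key.
pattern = re.compile(r"^http://fs\d+\.filehippo\.com/\d+/([0-9a-f]{32})/([^/]+)$")

keys = {pattern.match(u).groups() for u in urls}
print(keys)  # all three URLs collapse to a single key
```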
As for SourceForge, I will share the method I have used.
The following is the nginx site.conf content:
#start
server {
  listen       127.0.0.1:8086;

  location / {
    root /usr/local/www/nginx_cache/files;
    # $uri is the normalized request path; serve the file from the
    # local store if it is already there, otherwise fall through to
    # the named location below.
    try_files "/sf$uri" @proxy_sf.net;
  }

  location @proxy_sf.net {
    resolver 192.168.10.201;
    # $host is the requested hostname and $request_uri is the original
    # request path including any query string, so a cache miss is
    # fetched from the real origin server.
    proxy_pass http://$host$request_uri;
    proxy_temp_path "/usr/local/www/nginx_cache/tmp";
    # proxy_store saves a plain copy of the fetched file under the
    # same path that try_files checks above.
    proxy_store "/usr/local/www/nginx_cache/files/sf$uri";

    proxy_set_header X-SourceForge-Cache "eliezer@xxxxxxxxxxxx";
    proxy_set_header Accept "*/*";
    proxy_set_header User-Agent "sourceforge Cacher (nginx)";
    proxy_set_header Accept-Encoding "";
    proxy_set_header Accept-Language "";
    proxy_set_header Accept-Charset "";
    proxy_set_header Cache-Control "";
    access_log /var/log/nginx/sf.net.access_log;
  }
}

#end of nginx site.conf

#in squid.conf I used:
acl sfdl dstdom_regex (dl\.sourceforge\.net)$

cache_peer local_sf parent 8086 0 no-query no-digest proxy-only
cache_peer_access local_sf allow sfdl
cache_peer_access local_sf deny all

never_direct allow sfdl
never_direct deny all

cache deny sfdl
cache allow all

#in the hosts file I added:
127.0.0.1       local_sf

#done
The main problem with nginx as a proxy is that it will, in any case, download the full file from the source: if your client aborts the download, nginx will still fetch the whole file, as far as I remember. (Not 100% sure whether Squid or nginx caused this.)

I also used nginx for other sites that store images, such as imageshack.us, with almost the same method, because nginx seems to serve the files very, very fast, and because of the huge number of objects I spared the Squid index file from having to hold the object information for the images.

Regards,
Eliezer


NOTE: refresh_pattern has nothing to do with where squid caches the
object or where it fetches from. Only how long cacheable things get stored.

Amos


