
Re: Caching files from Amazon S3

Well, I was using GetRight to download the image from the client end, and it has an option that lets me see the headers sent and received. However, even when I try with a normal browser I still get the same issue: the images are not cached.

--
abidoon
----- Original Message ----- From: "Adrian Chadd" <adrian@xxxxxxxxxxxxxxx>
To: "Abidoon Nadeem" <abidoon@xxxxxxxxxxxxxx>
Cc: <squid-users@xxxxxxxxxxxxxxx>
Sent: Sunday, March 16, 2008 10:04 AM
Subject: Re:  Caching files from Amazon S3


Annoyingly, why the hell is the request from the client a range request?
Squid can't easily cache those unless it somehow fetches the entire object
first.
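One way around this, if the client-side range requests can't be avoided, is squid's `range_offset_limit` directive, which tells squid to fetch the whole object from the origin even when the client only asked for part of it (a sketch; verify against the squid 2.6 documentation before relying on it):

```conf
# squid.conf sketch: -1 removes the offset limit, so squid fetches the
# entire object on a Range request, making it cacheable. Trade-off: the
# proxy may download far more than the client asked for.
range_offset_limit -1
```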




Adrian

On Sat, Mar 15, 2008, Abidoon Nadeem wrote:
Hi,

We are stuck with a unique situation and require some help.

We are trying to use squid as a reverse proxy. I have been able to
configure it to cache normal images such as abc.gif residing on the local
hard drive.

I run the webserver on a virtual IP and have squid running on our internet-facing IP. The request comes to squid from the client; squid checks whether it has the relevant file in cache, and otherwise talks to the webserver on the backend and fetches it.

The issue is that we want to hide where the images are coming from, since
they can come from S3 or our local system. So we used mod_rewrite and wrote
a rule which passes this URL
http://oursite.com/Utilities/feeds/getimages/display/thumb/142358.jpg from
the client to squid and from squid to Apache. Apache then hands the string
to getimages.php, where we parse it, decode it, and return the JPEG file.
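A rewrite rule of that shape might look roughly like the following (hypothetical: the real rule and the query-parameter names expected by getimages.php are not shown in this thread):

```conf
# Apache mod_rewrite sketch. The "size" and "id" parameter names are
# illustrative assumptions, not taken from the actual getimages.php.
RewriteEngine On
RewriteRule ^/Utilities/feeds/getimages/display/([^/]+)/([0-9]+)\.jpg$ /getimages.php?size=$1&id=$2 [L]
```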

The HTTP response that the client receives is given below:

!!!! ----Header Recv----
HTTP/1.0 206 Partial Content
Date: Sat, 15 Mar 2008 16:42:21 GMT
Server: Apache/2.2.6 (Fedora)
X-Powered-By: PHP/5.2.4
Cache-Control: private, must-revalidate, max-age=0
Last-Modified: Sat, 15 Mar 2008 16:42:21 GMT
Accept-Ranges: bytes
ETag: "9518df2e0531022b1413d21978d6fb80"
Content-Type: image/jpeg
Content-Range: bytes 0-14673899/14673900
Content-Length: 14673900
X-Cache: MISS from ratchet.vistaclick.lahore
X-Cache-Lookup: MISS from ratchet.vistaclick.lahore:80
Via: 1.0 ratchet.vistaclick.lahore:80 (squid/2.6.STABLE17)
Connection: close
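Two things in that header dump already explain the misses: `Cache-Control: private, must-revalidate, max-age=0` forbids a shared cache like squid from storing the object at all, and the `206 Partial Content` status with `Content-Range` makes it a partial response that squid 2.6 will not cache. Note also that `Last-Modified` equals `Date`, suggesting the script stamps the current time on every request, which defeats revalidation. A cacheable response would look more like this (a sketch; the values are illustrative):

```http
HTTP/1.0 200 OK
Date: Sat, 15 Mar 2008 16:42:21 GMT
Content-Type: image/jpeg
Content-Length: 14673900
Last-Modified: Tue, 04 Mar 2008 09:00:00 GMT
ETag: "9518df2e0531022b1413d21978d6fb80"
Cache-Control: public, max-age=86400
```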

The issue is that when I look at the squid access log for this response,
it shows the following when we press F5:

1205517662.101     81 192.168.0.241 TCP_MISS/200 16932 GET
http://oursite.com/Utilities/feeds/getimages/display/thumb/142361.jpg -
FIRST_UP_PARENT/192.168.0.237 image/jpeg

We get a TCP_MISS/304 when we try with Ctrl+F5.

Now the issue is that we have manipulated the HTTP headers as much as we
can using the PHP HTTP library, but we cannot get squid to cache these
images. It caches normal images just fine; however, when we serve an image
through our getimages.php script we only get entries like the ones above in
the log file.

Please let me know if there is any way to make squid cache the images that
we process through PHP.
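In case it helps, here is a sketch of what getimages.php could emit to make the response cacheable. This is an assumption-laden illustration, not your actual script: the path lookup is hypothetical, and it presumes the script can send the whole file in one go rather than a partial response.

```php
<?php
// Sketch only: $path stands in for whatever decoding/lookup the real
// getimages.php does; that logic is not shown in this thread.
$path = '/path/to/decoded/image.jpg';

header('Content-Type: image/jpeg');
header('Content-Length: ' . filesize($path));
// Use the file's real mtime, not the current time, so the validator is
// stable across requests and conditional revalidation can succeed.
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', filemtime($path)) . ' GMT');
header('ETag: "' . md5_file($path) . '"');
// "public" plus a positive max-age allows a shared cache such as squid
// to store the object; "private, max-age=0" forbids it.
header('Cache-Control: public, max-age=86400');

readfile($path);
```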

Any help you can offer would be greatly appreciated by a squid newbie :)

--
Abidoon

SQUID CONFIGURATION FILE BELOW
------------------------------------------------------------------------------------------------------------------

# $Id: squid.conf,v 1.1 2005/07/16 22:24:57 jmates Exp $

#visible_hostname

#http_port 80  vhost
http_port 192.168.0.236:80  defaultsite=www.mysite.com vhost
cache_peer 192.168.0.237  parent 80 0 no-query originserver


maximum_object_size 15360 KB
# minimum_object_size 0 KB

maximum_object_size_in_memory 1500 KB

cache_replacement_policy lru
memory_replacement_policy lru

logfile_rotate 3

cache_log none
cache_store_log none

cache_access_log /var/log/squid/access_log

# use local caching name server, avoid /etc/hosts
dns_nameservers 208.67.222.222
hosts_file none

#auth_param basic children 5
#auth_param basic realm Squid proxy-caching web server
#auth_param basic credentialsttl 2 hours

request_header_max_size 16 KB

negative_ttl 1 minutes
negative_dns_ttl 1 minutes

# connect_timeout 1 minutes
# peer_connect_timeout 30 seconds
# read_timeout 10 minutes
# request_timeout 5 minutes
# persistent_request_timeout 2 minute
# half_closed_clients on

acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80          # http
acl Safe_ports port 21          # ftp
acl Safe_ports port 443 563     # https, snews
acl Safe_ports port 70          # gopher
acl Safe_ports port 210         # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280         # http-mgmt
acl Safe_ports port 488         # gss-http
acl Safe_ports port 591         # filemaker
acl Safe_ports port 777         # multiling http
acl CONNECT method CONNECT

http_access allow manager localhost
#http_access deny manager
# Deny requests to unknown ports
#http_access deny !Safe_ports
# Deny CONNECT to other than SSL ports
#http_access deny CONNECT !SSL_ports

#http_access deny to_localhost

#acl okdomains dstdomain www2.noog.com
#http_access deny !to_localhost
http_access allow all

# And finally deny all other access to this proxy
#http_access deny all

http_reply_access allow all

#icp_access deny all
miss_access allow all

ident_lookup_access deny all

cache_mgr webmaster@xxxxxxxxxx

cache_effective_user squid

#httpd_accel_host 192.168.0.5
#httpd_accel_port 81
#httpd_accel_uses_host_header off
#httpd_accel_single_host on
#httpd_accel_with_proxy on

forwarded_for on

log_icp_queries on

#snmp_port 0
#snmp_access deny all

# offline_mode off
coredump_dir none

pipeline_prefetch on

mime_table /etc/squid/mime.conf



--
- Xenion - http://www.xenion.com.au/ - VPS Hosting - Commercial Squid Support -
- $25/pm entry-level VPSes w/ capped bandwidth charges available in WA -


