On Wed, Aug 7, 2013 at 1:59 PM, Akash Jain <akash.delhite@xxxxxxxxx> wrote:
> Per Akamai Guy, Vary shows Akamai that content can vary, so Akamai is
> not caching, and this is leading Akamai to make requests to our
> webversion ...
> We mostly just have JS and CSS served from Akamai ..
I think whoever you're talking about at Akamai isn't being very
helpful. I know that, at a minimum, you can simply not use compression
between you and Akamai and then turn on content acceleration, and
Akamai will do the compression for you. But I'm pretty sure they can
also support compression from the origin.
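As a rough sanity check for which of those two modes your origin is in,
you can hit it directly and compare what comes back with and without
gzip being advertised. The hostname and path below are just
placeholders, not anything from your actual setup:

# Plain curl sends no Accept-Encoding header, so the origin should
# answer with an uncompressed body (no Content-Encoding):
$ curl -v -o /dev/null http://origin.example.com/css/site.min.css

# --compressed makes curl advertise the encodings it supports
# (gzip/deflate); a gzip-capable origin should answer with
# Content-Encoding: gzip and, ideally, Vary: Accept-Encoding:
$ curl --compressed -v -o /dev/null http://origin.example.com/css/site.min.css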
Using a random CSS file from GoDaddy's website:
http://img2.wsimg.com/pc_css/1/gd_H_20130624_http.min.css
If I do the following, with and without --compressed, I see that the
file is cached:
$ curl -v -o /dev/null \
    -H 'Pragma: akamai-x-cache-on, akamai-x-get-cache-key, akamai-x-get-true-cache-key, akamai-x-serial-no' \
    http://img2.wsimg.com/pc_css/1/gd_H_20130624_http.min.css
(note the X-Cache response header showing TCP_MEM_HIT).
Using the X-Cache-Key response header you can find the origin server,
which is images.secureserver.net in this case...
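If you only want the cache-related headers rather than the full verbose
output, something like this should work (same Pragma values as above;
the exact set of debug headers Akamai returns can vary):

# Dump the response headers from a GET, discard the body, and keep the
# cache-related ones (X-Cache, X-Cache-Key, X-True-Cache-Key, ...):
$ curl -s -o /dev/null -D - \
    -H 'Pragma: akamai-x-cache-on, akamai-x-get-cache-key, akamai-x-get-true-cache-key' \
    http://img2.wsimg.com/pc_css/1/gd_H_20130624_http.min.css | grep -i cache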
Hitting it like so:
$ curl --compressed -v -o /dev/null \
    http://img2.wsimg.com/pc_css/1/gd_H_20130624_http.min.css
I see that they are using Content-Encoding: gzip and Vary: Accept-Encoding.
I'm not sure if there's some configuration on their side to keep Akamai
from requesting compression, or to have their origin server refuse to
give Akamai gzip. Unfortunately I don't have an Akamai setup to play
with anymore.
The thing is, Akamai benefits from supporting this properly, because
their bandwidth bill for retrieving data from the origin server goes
down.