On Tue, Jul 31, 2007, Leonardo Rodrigues Magalhães wrote:

> What I have noticed over the last 3-4 years is that the BYTE hit
> ratio is getting lower each year. And that's somewhat to be expected.
> Several sites, including those with stale content, are starting to
> use site generator systems, like wikis and similar software. Not to
> mention that sites are really going dynamic and setting their expire
> values correctly. All of this has been pushing the byte hit ratio
> down over the last few years, at least for me. And we can't forget
> all the multimedia content, which usually doesn't get cached because
> of maximum_object_size. All of this contributes to the byte hit
> ratio being NOT too high.

I gave a talk at NANOG almost 7 years ago on caching which had a slide
on effective byte hit rate as a function of request rate and object
size (IIRC). The point I was making was that small objects were
important then, but large objects give a much better byte hit rate
return, and that'd be important in later years. And it is :)

You should look at your site traffic breakdown and identify which
sites are the busiest. You'll probably find video sites up there, and
video-over-HTTP would be pretty cacheable if the content providers
made it so. Some people have hacked up Squid to test stuff like
caching YouTube - and there's -definitely- a benefit in caching it -
but no one wants to contribute it publicly.

So caching is still relevant, but the people who would benefit from
further work and the people working on open source caching solutions
aren't talking. (Well, they're talking, but not more than "it would be
nice" from side A and "I have to eat" from side B.)

Adrian
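To make the byte-hit-ratio point above concrete, here is a
back-of-the-envelope worked example; all of the traffic numbers are
invented for illustration, not taken from the post:

```latex
% Byte hit ratio: bytes served from cache over total bytes served.
\[
  \mathrm{BHR} = \frac{\text{bytes served from cache}}
                      {\text{total bytes served}}
\]
% Invented workload: 10,000 requests/day for 10 KB pages at a 40% hit
% rate, plus 100 requests/day for 100 MB videos at a 20% hit rate:
\[
  \mathrm{BHR}
    = \frac{0.4 \cdot 10000 \cdot 10\,\mathrm{KB}
            + 0.2 \cdot 100 \cdot 100\,\mathrm{MB}}
           {10000 \cdot 10\,\mathrm{KB} + 100 \cdot 100\,\mathrm{MB}}
    \approx \frac{2.04\,\mathrm{GB}}{10.1\,\mathrm{GB}}
    \approx 20\%
\]
```

On that same traffic the *request* hit ratio is about 40%, carried
almost entirely by the small objects, while the byte hit ratio is
carried almost entirely by the videos - exactly the
small-objects-then, large-objects-later point made above.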
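On the maximum_object_size point in the quoted text: Squid's default
caps cacheable objects at 4 MB, so large media is fetched but never
stored. A minimal squid.conf sketch of lifting that cap; the
directives are standard Squid ones, but the sizes are arbitrary
examples to tune for your own disks:

```
# Default is 4 MB; anything bigger is fetched but never cached,
# which is why multimedia barely contributes to byte hit ratio.
maximum_object_size 512 MB

# Large objects need disk to live in: a 50 GB ufs cache dir
# (the size and the L1/L2 directory counts are example values).
cache_dir ufs /var/spool/squid 51200 16 256

# Keep big objects out of cache_mem; serve them from disk instead.
maximum_object_size_in_memory 64 KB
```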
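For reference on the YouTube hack: the usual approach (which Squid 2.7
later shipped as the storeurl_rewrite feature, after this post was
written) is to collapse the many volatile URLs one video is served
under into a single canonical store key, so requests hitting different
CDN hostnames or session parameters map to the same cached object. A
sketch assuming Squid 2.7-style directives; the ACL domains and helper
path are placeholders:

```
# Only rewrite store keys for the video hosts (placeholder domains).
acl video_hosts dstdomain .youtube.com .googlevideo.com
storeurl_access allow video_hosts
storeurl_access deny all

# External helper that maps volatile video URLs (session ids,
# rotating CDN hostnames) onto one canonical key per video id.
# The path is hypothetical; the helper itself must be written.
storeurl_rewrite_program /usr/local/squid/bin/store_url_rewrite.pl
storeurl_rewrite_children 5
```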