Hello,

I've set up my first reverse proxy to accelerate a site. I've used wget to spider the entire site several times and noticed that, even after running it, some files never get cached, such as HTML files! I presume this is because the HTML responses don't carry the right cache headers. Squid didn't even want to cache .swf files, but adding this line helped a lot, though not completely:

    refresh_pattern . 0 20% 4320 ignore-reload

I'm thinking the best approach is to make Squid cache EVERYTHING, and then manually give it specific exceptions for dynamic content (like .php files and some .html files with embedded PHP scripts). I don't want to start editing the site's files, because I want to test the performance increase before adding headers one by one.

Unless somebody suggests something better, could someone advise me how to force Squid to cache everything, with an example of how to make exceptions? I've been looking at the FAQ without much help.

I've also tried:

    refresh_pattern \.html$ 60 80% 180
    refresh_pattern \.htm$ 60 80% 180

with the objective of forcing it to cache, but it isn't working.

Thanks!
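
To make it concrete, here's an untested sketch of the kind of squid.conf I have in mind. I'm not sure every option is available in all Squid versions (I believe ignore-no-cache was removed in newer releases), so treat this as an illustration, not a working config:

    # Catch-all: force-cache everything by overriding server expiry headers
    # and ignoring client reloads. Goes LAST, since refresh_pattern rules
    # are matched first-match-wins.
    refresh_pattern . 1440 80% 10080 override-expire override-lastmod ignore-reload

    # Exception: never cache dynamic content such as PHP scripts.
    acl dynamic urlpath_regex -i \.php$
    cache deny dynamic

Is something along these lines the right direction?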