Re: GZIP and Squid on a high performance website?

Henrik Nordström wrote the following on 24.02.2010 00:15:
> Tue 2010-02-23 at 22:07 +0100, Gerrit Berkouwer wrote:
>> What do you mean by "make sure your authoring system updates both"? Do
>> you mean that Apache will not recognize new content by itself and thus
>> make a brand new .gz file every time the content changes?
> 
> Not automatically, no.

I can't vouch for the authority of the site, but
http://schroepl.net/projekte/mod_gzip/config.htm implies that mod_gzip
can actually replace outdated precompressed files when the uncompressed
'companion' file is newer than the precompressed file already present:

"# automatic updates for statically precompressed files
  mod_gzip_update_static        No
# (if set to 'Yes', this directive (being new in version 1.3.26.1a) would
# cause mod_gzip to automatically update an outdated version of any
# statically precompressed file during the request, i. e. compress the
# originally requested file and overwrite the precompressed variant
# file with it!"

This means that you would need to 'preseed' the web site with
precompressed copies of all files to be provided in compressed form, but
after that mod_gzip would handle updating those files by itself.
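
For illustration, here is a minimal preseeding sketch in Python. It is only
a guess at how such a one-time run could look: the document root path, the
suffix list and the helper name are all hypothetical, and it simply writes a
'.gz' companion next to every matching file, which is the naming mod_gzip
expects for statically precompressed files as far as I understand it:

  import gzip
  import shutil
  from pathlib import Path

  # Hypothetical document root and file types to preseed; adjust as needed.
  DOCROOT = Path("/var/www/htdocs")
  SUFFIXES = {".html", ".css", ".js", ".txt", ".xml"}

  def preseed(docroot: Path) -> None:
      """Write a .gz companion for every matching file that has none,
      or whose companion is older than the original."""
      for src in docroot.rglob("*"):
          if not src.is_file() or src.suffix not in SUFFIXES:
              continue
          dst = src.with_name(src.name + ".gz")
          # Skip files whose precompressed copy is already up to date.
          if dst.exists() and dst.stat().st_mtime >= src.stat().st_mtime:
              continue
          with open(src, "rb") as f_in, gzip.open(dst, "wb") as f_out:
              shutil.copyfileobj(f_in, f_out)
          print(f"compressed {src} -> {dst}")

  if __name__ == "__main__":
      preseed(DOCROOT)

After such a one-time run, setting mod_gzip_update_static to 'Yes' should,
per the documentation quoted above, keep those companions current by itself.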

Please note that I don't have any mod_gzip experience myself, I just got
intrigued by this thread and went exploring a little.

Cheers,
Tobias

