On Tue, 1 Feb 2005, Eric S. Raymond wrote:
> OK, here is an answer.
Here's another one ;)
> Processor clocks are getting cheaper faster than memory is. Memory is getting cheaper faster than disk is. Disk is getting cheaper a *lot* faster than bits-per-second of bandwidth.
> Memory is getting bigger a lot faster than it is getting faster. Disk is getting bigger a lot faster than it is getting faster. No matter which part of a computer system you look at, capacity grows faster than speed does.
> The "right" tradeoff is to use lots of the cheap resources in order to use fewer of the expensive ones. Therefore, as these trends continue,
... we no longer need to care as much about the size of files, since the real bottleneck is how fast we can get at the data.
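
To put rough numbers on that: the time until you have usable data is download time plus decompression time. Here is a back-of-envelope sketch in Python; every size and throughput in it is an assumption I picked for illustration, not a measurement:

  # Total time to usable data = download time + decompression time.
  # All numbers below are illustrative assumptions, not measurements.
  ORIGINAL_MB = 250.0                          # assumed uncompressed tarball size
  SIZES_MB    = {"gzip": 47.0, "bzip2": 37.0}  # assumed compressed sizes
  DECOMP_MB_S = {"gzip": 60.0, "bzip2": 12.0}  # assumed decompression speed (output MB/s)
  LINK_MB_S   = 0.5                            # assumed ~4 Mbit/s ADSL link

  for codec, size in SIZES_MB.items():
      download   = size / LINK_MB_S                  # seconds on the wire
      decompress = ORIGINAL_MB / DECOMP_MB_S[codec]  # seconds unpacking
      print(f"{codec:5}: {download:3.0f}s download + {decompress:3.0f}s decompress"
            f" = {download + decompress:3.0f}s total")

With those made-up numbers the two formats come out nearly even on a slow link; as the link gets faster, the download term shrinks and gzip's much faster decompression starts to win.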
I know my ADSL can download bzip2 files faster than I can decompress them...
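
If you want to check that on your own machine, here is a minimal sketch that measures decompression throughput. The kernel tarball names are hypothetical placeholders; point it at any .gz/.bz2 pair you have:

  import bz2
  import gzip
  import time

  def decompress_rate(path, opener):
      """Decompression throughput in MB/s of *output* (unpacked) data."""
      start = time.perf_counter()
      total = 0
      with opener(path, "rb") as f:
          while True:
              chunk = f.read(1 << 20)  # pull 1 MiB of decompressed data at a time
              if not chunk:
                  break
              total += len(chunk)
      elapsed = time.perf_counter() - start
      return total / (1 << 20) / elapsed

  # Hypothetical file names -- substitute whatever you have lying around.
  print("gzip : %5.1f MB/s" % decompress_rate("linux-2.6.10.tar.gz", gzip.open))
  print("bzip2: %5.1f MB/s" % decompress_rate("linux-2.6.10.tar.bz2", bz2.open))

If the bzip2 figure comes out below your link's MB/s, then decompression, not the download, is your bottleneck.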
There is no perfect answer, but gzip seems to provide a good compromise between fast and small.
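
For what it's worth, the "fast" and "small" halves of that compromise are easy to see with Python's gzip and bz2 modules standing in for the command-line tools. The input path is an assumption; any reasonably large, compressible file will do:

  import bz2
  import gzip
  import time

  # Assumed input file -- any decent-sized text file works.
  data = open("/usr/share/dict/words", "rb").read()

  for name, comp, decomp in (("gzip", gzip.compress, gzip.decompress),
                             ("bzip2", bz2.compress, bz2.decompress)):
      t0 = time.perf_counter()
      packed = comp(data)    # "small": how well it shrinks the input
      t1 = time.perf_counter()
      decomp(packed)         # "fast": how quickly we get the data back
      t2 = time.perf_counter()
      print("%-5s: %3.0f%% of original, compress %.2fs, decompress %.2fs"
            % (name, 100.0 * len(packed) / len(data), t1 - t0, t2 - t1))

On most inputs you should see bzip2 produce the smaller file while gzip wins comfortably on both timings, which is the compromise in a nutshell.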
-- 
"Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it." - Brian W. Kernighan