Re: Compression speed for large files

On Tue, 4 Jul 2006, Joachim Berdal Haga wrote:
> 
> Here's a test with "time gzip -[169] -c file >/dev/null". Random data
> from /dev/urandom, kernel headers are concatenation of *.h in kernel
> sources. All times in seconds, on my puny home computer (1GHz Via Nehemiah)
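For anyone wanting to reproduce the numbers, the quoted command line expands to roughly the sketch below. The file name and the 1 MiB size are my assumptions; the original message does not give file sizes.

```shell
# Rough reconstruction of the quoted benchmark: "time gzip -[169] -c file >/dev/null".
# random.dat and the 1 MiB size are assumptions; substitute your own test files
# (e.g. concatenated kernel headers) for the compressible case.
head -c 1048576 /dev/urandom > random.dat

for level in 1 6 9; do
    echo "gzip -$level:"
    time gzip -"$level" -c random.dat >/dev/null
done
```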

That "Via Nehemiah" is probably a big part of it.

I think the VIA Nehemiah has only a 64kB L2 cache, and I bet performance
plummets once the compression tables grow past that.
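For scale, zlib's zconf.h documents deflate's memory requirement as (1 << (windowBits+2)) + (1 << (memLevel+9)) bytes, so with the defaults gzip's deflate uses (windowBits=15, memLevel=8) the window and hash tables alone come to 256 KiB, well past a 64kB L2. The arithmetic below is my own back-of-the-envelope check, not from the thread:

```shell
# deflate memory per zlib's zconf.h: (1 << (windowBits+2)) + (1 << (memLevel+9)).
# windowBits=15 and memLevel=8 are the zlib defaults.
window_bits=15
mem_level=8
echo $(( (1 << (window_bits + 2)) + (1 << (mem_level + 9)) ))   # prints 262144
```

That is 128 KiB of window plus 128 KiB of hash tables, so even the low compression levels have a working set larger than a 64kB cache; the higher levels then walk those tables far more aggressively.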

And I think a large part of the extra cost at the higher compression levels
is that they let the compression window and tables grow bigger.

		Linus
-
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at http://vger.kernel.org/majordomo-info.html
