Peter Stahlir wrote:
> But the thing is, I think there is a lot of redundancy in
> a) a Debian mirror or

Yes, certainly. Your idea suggests that you want any file to be
reconstructed on-the-fly whenever it is requested. Isn't there a
danger of killing performance, with the CPU becoming the bottleneck?
I imagine such a Debian mirror gets quite a bit of traffic. (A rough
way to put a number on the decompression side of that is sketched in
the P.S. below.)

> b) your disk at home.

I doubt it. There is certainly a lot of redundancy within each file,
and that's what compressed file systems are good at exploiting. But
what you are talking about is redundancy across (unversioned) files,
and I don't think there is much of it. Yes, I might have a few copies
of the file COPYING on my disk, and maybe some of my sources share a
few functions, but that won't save me tons of space. My binaries,
libraries, MP3s, videos, config files, etc. don't really have any
redundancy across file boundaries. And even if some exists, finding
it is an O(whatever-but-not-n) operation that would be rather slow;
even the cheapest approach means reading and hashing every byte on
the disk (see the second sketch below).

I definitely see gitfs (or similar ideas) as potentially being useful
in some cases (maybe Debian mirrors could be one), but not for my
disk at home, which I would generally prefer to be fast rather than
highly compressed.

jlh
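
P.S. Here is a minimal sketch of the decompression-throughput
question, using zlib (which git itself uses for its object store).
The buffer size, data pattern, and round count are made-up values
for illustration, not measurements; build with -lz. Comparing the
printed MiB/s figure against a mirror's outbound bandwidth gives a
first idea of whether the CPU would really be the bottleneck.

/* Sketch: measure zlib inflate throughput, to estimate the CPU
 * cost per byte of serving reconstructed (decompressed) files.
 * All sizes here are arbitrary; link with -lz. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#include <zlib.h>

int main(void)
{
	const size_t SIZE = 16 * 1024 * 1024;	/* 16 MiB test buffer */
	unsigned char *plain = malloc(SIZE);
	uLongf clen = compressBound(SIZE);
	unsigned char *packed = malloc(clen);
	size_t i;

	if (!plain || !packed)
		return 1;

	/* mildly redundant data so it actually compresses */
	for (i = 0; i < SIZE; i++)
		plain[i] = (unsigned char)(i % 251);

	if (compress(packed, &clen, plain, SIZE) != Z_OK)
		return 1;

	clock_t t0 = clock();
	int rounds = 20;
	for (int r = 0; r < rounds; r++) {
		uLongf dlen = SIZE;
		if (uncompress(plain, &dlen, packed, clen) != Z_OK)
			return 1;
	}
	double secs = (double)(clock() - t0) / CLOCKS_PER_SEC;
	printf("inflate: %.1f MiB/s\n",
	       (double)SIZE * rounds / (1024 * 1024) / secs);
	free(plain);
	free(packed);
	return 0;
}

Note this is a lower bound on the CPU cost: real on-the-fly
reconstruction would also have to apply deltas, not just inflate.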
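
P.P.S. And a toy sketch of what "finding redundancy across files"
involves: hash every fixed-size block of the files named on the
command line, sort the hashes, and count duplicates. The 4 KiB block
size and the FNV-1a hash are arbitrary choices I made for the
example; a real deduplicator would use content-defined chunking and
a strong hash. Even this cheap version has to read every byte and do
an O(n log n) sort, which is my point about it being slow.

/* Toy cross-file duplicate-block finder.  Hashes every BLOCK-sized
 * chunk of each input file, sorts the hashes, counts collisions. */
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>

#define BLOCK 4096

static uint64_t fnv1a(const unsigned char *p, size_t n)
{
	uint64_t h = 14695981039346656037ULL;
	while (n--)
		h = (h ^ *p++) * 1099511628211ULL;
	return h;
}

static int cmp_u64(const void *a, const void *b)
{
	uint64_t x = *(const uint64_t *)a, y = *(const uint64_t *)b;
	return x < y ? -1 : x > y;
}

int main(int argc, char **argv)
{
	uint64_t *hashes = NULL;
	size_t nhash = 0, cap = 0;
	unsigned char buf[BLOCK];

	for (int i = 1; i < argc; i++) {
		FILE *f = fopen(argv[i], "rb");
		size_t got;
		if (!f)
			continue;
		while ((got = fread(buf, 1, BLOCK, f)) > 0) {
			if (nhash == cap) {
				cap = cap ? cap * 2 : 1024;
				hashes = realloc(hashes,
						 cap * sizeof(*hashes));
				if (!hashes)
					return 1;
			}
			hashes[nhash++] = fnv1a(buf, got);
		}
		fclose(f);
	}

	/* The sort is the cheap part; the real cost is having to read
	 * every byte of every file just to find out. */
	qsort(hashes, nhash, sizeof(*hashes), cmp_u64);

	size_t dups = 0;
	for (size_t i = 1; i < nhash; i++)
		if (hashes[i] == hashes[i - 1])
			dups++;
	printf("%zu blocks, %zu duplicate-hash blocks\n", nhash, dups);
	free(hashes);
	return 0;
}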