On Tue, May 18, 2010 at 8:37 PM, Anthony W. Youngman <wol@xxxxxxxxxxxxxxxxxxxxx> wrote:
> Just because YOUR computer is modern and is happy being fed a new bigger
> hard drive doesn't mean they all are. This computer here has 3/4 gig of
> RAM. Tiny by modern standards, but I can't put any more in - it only has
> three slots at 256Mb maximum each. And it's got a 250Gb drive but it can
> only use the first 128Gb (I'm being economical with the truth here, but
> hey...)
>
> Anyways. Why should hundreds of people have to throw out thousands of
> serviceable machines just because a few programmers can't be assed to at
> least TRY to be economical with their usage of resources?

It's a tradeoff. There are plenty of programs that can sync files back
and forth *without* keeping a history - and those tools are mostly not
used. IMHO that's because they're too complicated and dangerous: if
something goes wrong with your sync, the mistakenly deleted or modified
files are gone for good. If I care enough about my files to want to
replicate them for safety, then I care too much about them to trust them
to an unpredictable sync algorithm.

A version control system like git, on the other hand, makes a different
tradeoff: you can be reasonably sure that it will *never* permanently
lose data, but to get that assurance, you're going to pay for it in disk
space.

If you want to use yesterday's computers, you're probably going to have
to be satisfied with yesterday's solutions. AFAIK, home directory
replication has never been adequately solved.

Of course, someone could still come along and invent an elegant, fast,
reliable, space-efficient, trustworthy solution to this problem. But I
don't think that person has come along yet.

Have fun,

Avery