Ken Brownfield wrote:
> git filter-branch --index-filter 'git rm -r --cached --ignore-unmatch -- bigdirtree stuff/a stuff/b stuff/c stuff/dir/{a,b,c}' --prune-empty --tag-name-filter cat -- --all
[...]
> Now that the same repository has grown, this same filter-branch
> process now takes 6.5 *days* at 100% CPU on the same machine (2x4
> Xeon, x86_64) on git-1.7.3.2.  There's no I/O, memory, or other
> resource contention.

If all you do is an index-filter for deletion, I think it should be
rather easy to achieve good results by filtering the fast-export
stream to remove these files, and then piping that back to
fast-import.  (It's just that AFAIK nobody has written that code
yet.)

--
Thomas Rast
trast@{inf,student}.ethz.ch
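
[Editorial addendum: a rough, untested sketch of the stream filter
described above.  The script name strip-paths.py and the repository
layout in the usage comment are made up for illustration.  The filter
copies counted 'data' blocks through verbatim so blob and commit-message
contents are never misparsed as stream commands, and simply drops the
'M'/'D' entries for the unwanted paths; it does not handle C-quoted
paths, 'R'/'C' (rename/copy) entries, or pruning of commits that end up
empty, which --prune-empty handled in the filter-branch invocation
above.]

#!/usr/bin/env python3
#
# Hypothetical, untested sketch: filter a git fast-export stream to drop
# filemodify/filedelete entries for unwanted paths, and copy everything
# else through unchanged for git fast-import.
#
# Usage sketch (repository names are illustrative):
#   cd old-repo
#   git fast-export --all | /path/to/strip-paths.py > ../filtered.fi
#   git init ../new-repo && cd ../new-repo && git fast-import < ../filtered.fi
import sys

# The paths removed by the original --index-filter.
UNWANTED = (
    b"bigdirtree",
    b"stuff/a", b"stuff/b", b"stuff/c",
    b"stuff/dir/a", b"stuff/dir/b", b"stuff/dir/c",
)

def unwanted(path):
    return any(path == p or path.startswith(p + b"/") for p in UNWANTED)

inp, out = sys.stdin.buffer, sys.stdout.buffer

while True:
    line = inp.readline()
    if not line:
        break
    if line.startswith(b"data "):
        # Counted data block (blob contents, commit messages): copy the
        # raw bytes verbatim so they can never be mistaken for commands.
        count = int(line[5:])
        out.write(line)
        out.write(inp.read(count))
        continue
    if line.startswith(b"M "):
        # M <mode> <dataref> <path>
        path = line.rstrip(b"\n").split(b" ", 3)[3]
        if unwanted(path):
            continue
    elif line.startswith(b"D "):
        # D <path>
        path = line.rstrip(b"\n").split(b" ", 1)[1]
        if unwanted(path):
            continue
    out.write(line)

[The dropped paths' blobs still travel through the stream; they merely
end up unreferenced in the new repository and can be discarded by a
later gc.]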