Eli Zaretskii wrote:
>> From: Andreas Ericsson <ae@xxxxxx>
>>
>> I *think* (correct me if I'm wrong) that git is still faster
>> than a whole bunch of other SCMs on Windows, but to one who's used to
>> its performance on Linux, waiting several seconds to scan 10k files
>> just feels wrong.
> Unless that 10K is a typo and you really meant 100K, I don't think 10K
> files should take several seconds to scan on Windows. I just tried
> "find -print" on a directory with 32K files in 4K subdirectories, and
> it took 8 sec elapsed with a hot cache. So 10K files should take at
> most 2 seconds, even without optimizing file traversal code. Doing
> the same with native Windows system calls ("dir /s") brings that down
> to 4 seconds for 32K files.
It was a typo. Thanks for correcting me.
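
For reference, the sketch below is roughly the kind of work such a scan
does: a plain POSIX nftw() walk that stats every entry and reports the
elapsed time. It's only an illustration (not git code, and not how the
numbers above were measured), but it's essentially the per-file work
"find -print" or a git index refresh has to do.

/* Illustrative sketch only: time a recursive walk over a tree.
 * Build: cc -O2 walk.c -o walk   (add -lrt on older glibc)
 * Run:   ./walk /path/to/tree
 */
#define _XOPEN_SOURCE 700
#include <ftw.h>
#include <sys/stat.h>
#include <stdio.h>
#include <time.h>

static long entries;

static int visit(const char *path, const struct stat *sb,
                 int type, struct FTW *ftw)
{
    (void)path; (void)sb; (void)type; (void)ftw;
    entries++;              /* every file and directory we stat */
    return 0;               /* 0 == keep walking */
}

int main(int argc, char **argv)
{
    struct timespec t0, t1;

    if (argc < 2) {
        fprintf(stderr, "usage: %s <dir>\n", argv[0]);
        return 1;
    }
    clock_gettime(CLOCK_MONOTONIC, &t0);
    if (nftw(argv[1], visit, 64, FTW_PHYS) != 0) {
        perror("nftw");
        return 1;
    }
    clock_gettime(CLOCK_MONOTONIC, &t1);
    printf("%ld entries in %.2f seconds elapsed\n", entries,
           (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9);
    return 0;
}

On Windows the equivalent enumeration goes through
FindFirstFile()/FindNextFile(), which is one place the gap against
Linux tends to show up.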
> On the other hand, what packages have 100K files? If there's only one
> -- the Linux kernel -- then I think this kind of performance is for
> all practical purposes unimportant on Windows.
But it's most definitely not. The *huge* projects that have looked at
git have sometimes turned it down simply because they're either cross-
platform (Mozilla) or they have translators that use Windows exclusively
(KDE and Mozilla, just to mention two).
Both Mozilla and KDE repos are *much* larger than the Linux repo.
--
Andreas Ericsson andreas.ericsson@xxxxxx
OP5 AB www.op5.se
Tel: +46 8-230225 Fax: +46 8-230231