From: "Christian Couder" <christian.couder@xxxxxxxxx>
This patch series is built on top of cc/perf-run-config which recently
graduated to master.
It makes it possible to send perf results to a Codespeed server. See
https://github.com/tobami/codespeed/ and sites like
http://speed.pypy.org/ that use Codespeed.
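For context, Codespeed ingests results as a batch of JSON records
POSTed to its /result/add/json/ endpoint. Here is a rough Python
sketch, not part of this series, of what such a submission could look
like; the field names follow the Codespeed README, while the concrete
project/executable/benchmark values below are invented for
illustration:

import json
import urllib.parse
import urllib.request

CODESPEED_URL = "http://localhost:8000"      # perf.sendToCodespeed

results = [{
    "commitid": "v2.13.0",                   # revision being measured
    "branch": "master",
    "project": "Git",
    "executable": "with libpcre",            # one per perf.* subsection
    "benchmark": "7810.1 grep worktree",     # perf test name
    "environment": "Git repo",               # perf.repoName (see caveat below)
    "result_value": 0.42,                    # timing in seconds
}]

data = urllib.parse.urlencode({"json": json.dumps(results)}).encode()
urllib.request.urlopen(CODESPEED_URL + "/result/add/json/", data)

Judging by the patch titles, the JSON output comes from the aggregate
script and the upload is done by the run script.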
The end goal would be to have such a server always available to track
how the different git commands perform over time on different kinds of
repos (small, medium, large, ...) with different optimizations turned
on and off (split-index, libpcre2, BLK_SHA1, ...).
Dumb question: is this expected to also be able to do a retrospective on the
performance of appropriate past releases? That would allow immediate
performance comparisons, rather than needing to wait for a few releases to
see the trends.
Philip
With this series and a config file like:
$ cat perf.conf
[perf]
dirsOrRevs = v2.12.0 v2.13.0
repeatCount = 10
sendToCodespeed = http://localhost:8000
repoName = Git repo
[perf "with libpcre"]
makeOpts = "DEVELOPER=1 USE_LIBPCRE=YesPlease"
[perf "without libpcre"]
makeOpts = "DEVELOPER=1"
One should be able to just launch:
$ ./run --config perf.conf p7810-grep.sh
and then get nice graphs in a Codespeed instance running on
http://localhost:8000.
Caveat
~~~~~~
For now one has to create the "Git repo" environment in the Codespeed
admin interface. (We send the perf.repoName config variable in the
"environment" Codespeed field.) This is because Codespeed requires the
environment fields to be created and does not provide a simple way to
create these fields programmatically.
I might try to work around this problem in the future.
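One possible stopgap, assuming Codespeed's internal Django models stay
as they are today, would be to create the environment from a Django
shell inside the Codespeed instance ("python manage.py shell") rather
than clicking through the admin interface. This is only a sketch, not
part of the series:

# Create the environment matching perf.repoName if it is missing;
# assumes Codespeed exposes an Environment model with a "name" field.
from codespeed.models import Environment

Environment.objects.get_or_create(name="Git repo")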
Links
~~~~~
This patch series is available here:
https://github.com/chriscool/git/commits/codespeed
The cc/perf-run-config patch series was discussed here:
v1:
https://public-inbox.org/git/20170713065050.19215-1-chriscool@xxxxxxxxxxxxx/
v2:
https://public-inbox.org/git/CAP8UFD2j-UFh+9awz91gtZ-jusq7EUOExMgURO59vpf29jXS4A@xxxxxxxxxxxxxx/
Christian Couder (8):
perf/aggregate: fix checking ENV{GIT_PERF_SUBSECTION}
perf/aggregate: refactor printing results
perf/aggregate: implement codespeed JSON output
perf/run: use $default_value instead of $4
perf/run: add conf_opts argument to get_var_from_env_or_config()
perf/run: learn about perf.codespeedOutput
perf/run: learn to send output to codespeed server
perf/run: read GIT_TEST_REPO_NAME from perf.repoName
 t/perf/aggregate.perl | 164 +++++++++++++++++++++++++++++++++++---------------
 t/perf/run            |  29 +++++++--
2 files changed, 140 insertions(+), 53 deletions(-)
--
2.15.1.361.g8b07d831d0