Julia performance monitoring
============================

This directory contains benchmarks and related utilities to test Julia's
performance. Many of these benchmarks have been ported to the newer
[BaseBenchmarks package](https://github.com/JuliaCI/BaseBenchmarks.jl),
which contains the benchmark suite used for CI performance testing.
In general, new benchmarks should be added to that package instead
of being placed here (see the BaseBenchmarks README for details).

If you'd like to test the performance of your own package, consider using
the [BenchmarkTools package](https://github.com/JuliaCI/BenchmarkTools.jl).
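
For example, a quick standalone benchmark with BenchmarkTools might look
like the following sketch (`work` is a placeholder for whatever code you
actually want to measure):

```julia
using BenchmarkTools  # assumes the BenchmarkTools package is installed

work(n) = sum(sqrt(i) for i in 1:n)  # placeholder workload

# @benchmark runs the expression repeatedly and reports timing statistics
@benchmark work(1_000)
```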

Running the performance tests
-----------------------------
In `test/perf` run `make`. It will run the `perf.jl` script in all
the sub-directories and display the test name with the minimum,
maximum, mean, and standard deviation of the wall time of five repeated
test runs in microseconds.


There is also a `perfcomp.jl` script, but it may not currently work
with the rest of these utilities.

Code Organization
-----------------

Tests generally go into one of the following suites:

- `micro`: A set of micro-benchmarks commonly used to compare
programming languages; these results are shown on
[http://julialang.org/](http://julialang.org/).
- `spell`: Performance tests of
[Peter Norvig's spelling corrector](http://norvig.com/spell-correct.html).
- `sparse`: Performance tests of sparse matrix operations.

Otherwise, tests live in their own subdirectories, each containing a `perf.jl` file.

The `perf.jl` files include the shared performance utilities via
`include("../perfutil.jl")` and then run the performance test
functions with the `@timeit` macro. For example:
```julia
@timeit(spelltest(tests1), "spell", "Peter Norvig's spell corrector")
```
with arguments: test function call, name of the test, description,
and, optionally, a group. `@timeit` will do a warm-up run and then five
timed runs, calculating the minimum, maximum, mean, and standard deviation
of the timings.

If possible, aim for tests that take about 10-100 microseconds.
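
Putting this together, a minimal `perf.jl` for a hypothetical new suite
might look like the sketch below (the directory name `mysuite`, the
function `mybench`, and the label strings are all made-up illustrations,
following the `@timeit` convention shown above):

```julia
# test/perf/mysuite/perf.jl -- a hypothetical suite directory
include("../perfutil.jl")  # provides the @timeit macro

# Made-up workload, roughly sized so one call lands near the
# 10-100 microsecond target mentioned above.
function mybench(n)
    s = 0.0
    for i in 1:n
        s += sqrt(i)
    end
    return s
end

# One warm-up run, then five timed runs; reports the minimum, maximum,
# mean, and standard deviation of the timings.
@timeit(mybench(10_000), "mybench", "sum of square roots of 1:n")
```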

Package dependencies
--------------------
