update test/perf README (fix JuliaLang#19853)
jrevels committed Jan 4, 2017
1 parent d8c9b0c commit 28bcb4b
Showing 1 changed file with 22 additions and 22 deletions.
test/perf/README.md

Julia performance monitoring
============================

This directory contains benchmarks and related utilities to test Julia's
performance. Many of these benchmarks have been ported to the newer
[BaseBenchmarks package](https://github.com/JuliaCI/BaseBenchmarks.jl),
which contains the benchmark suite used for CI performance testing.
In general, new benchmarks should be added to that package rather than
placed here (see the BaseBenchmarks README for details).
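
As a rough, hypothetical sketch (the authoritative registration conventions
are in the BaseBenchmarks README), that package builds on BenchmarkTools'
`BenchmarkGroup`/`@benchmarkable` machinery, along these lines (the group
and benchmark names below are made up):

```julia
using BenchmarkTools

# Benchmarks live in a nested BenchmarkGroup tree...
SUITE = BenchmarkGroup()
SUITE["micro"] = BenchmarkGroup()

# ...and each entry is a lazily constructed benchmark definition.
SUITE["micro"]["sumsqrt"] = @benchmarkable sum(sqrt, 1:1000)

# Run the whole suite, collecting timing distributions per benchmark.
results = run(SUITE; verbose = true)
```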

If you'd like to test the performance of your own package, consider using
the [BenchmarkTools package](https://github.com/JuliaCI/BenchmarkTools.jl).
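
For instance, a minimal standalone benchmark might look like this (a sketch;
`mysum` is a placeholder for whatever function you want to measure):

```julia
using BenchmarkTools

# Placeholder function to measure.
mysum(xs) = reduce(+, xs)

# Interpolate the input with $ so that setup cost isn't timed;
# @benchmark reports minimum/median/mean times and allocations.
@benchmark mysum($(rand(1000)))
```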

Running the performance tests
-----------------------------

[…] test runs in microseconds.

There is also a `perfcomp.jl` script, but it may not currently work with
the rest of the suite.

Code Organization
-----------------

Tests generally go into one of the following suites:

- `micro`: A set of micro-benchmarks commonly used to compare
  programming languages; these results are shown on
  [http://julialang.org/](http://julialang.org/).
- […]
- `spell`: Performance tests of
  [Peter Norvig's spelling corrector](http://norvig.com/spell-correct.html).
- `sparse`: Performance tests of sparse matrix operations.

Otherwise, tests live in their own subdirectories, each containing a
`perf.jl` file.

Each `perf.jl` file includes the shared performance utilities via
`include("../perfutil.jl")` and then runs the performance test
functions with the `@timeit` macro. For example:
```julia
@timeit(spelltest(tests1), "spell", "Peter Norvig's spell corrector")
```
with arguments: the test function call, the name of the test, a
description, and, optionally, a group. `@timeit` will do a warm-up
and then 5 timings, calculating the min, max, average, and standard
deviation of the timings.

If possible, tests should aim to take about 10-100 microseconds.
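
Putting this together, a new suite's `perf.jl` might look like the following
sketch (the directory name, the `mybench` function, and its input size are
hypothetical):

```julia
# test/perf/mybench/perf.jl (hypothetical suite)
include("../perfutil.jl")

# A small function whose performance we want to track.
mybench(n) = sum(sqrt(i) for i in 1:n)

# Warm up, then take 5 timings and report min/max/average/std;
# the arguments are the call to time, a short name, and a description.
@timeit(mybench(1_000), "mybench", "sum of square roots up to n")
```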

Package dependencies
--------------------
[…]
