Landed in https://github.com/l20n/l20n.js/commit/98417af75d9372fe0944e4d039477ddfbf6e1c63
With process.hrtime, we measure time in nanoseconds, which I then convert to microseconds. This allows us to completely get rid of artificial iterations (with Date.now() the maximum resolution was milliseconds, so the script had to repeat all operations 500 or 1000 times just to get numbers bigger than zero).
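The idea can be sketched like this (a minimal Node.js example; `work` is a hypothetical stand-in for a single parse/compile/get call, not code from l20n.js):

```javascript
// process.hrtime returns [seconds, nanoseconds]; calling it again
// with the first result yields the elapsed time since that point.
function measureMicroseconds(fn) {
  const start = process.hrtime();
  fn();
  const [sec, nano] = process.hrtime(start);
  // Convert to microseconds: seconds * 1e6, nanoseconds / 1e3.
  return sec * 1e6 + nano / 1e3;
}

// Hypothetical workload standing in for one measured operation.
function work() {
  let s = 0;
  for (let i = 0; i < 1e5; i++) s += i;
  return s;
}

const micros = measureMicroseconds(work);
console.log(micros.toFixed(2) + ' µs');
```

Because the resolution is sub-microsecond, even a single run of a fast operation produces a non-zero measurement.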
For instance, running make perf on today's master gives us the following results:
```
parse:
  mean:   307.16 µs
  stdev:  142.38 µs
  sample: 500
compile:
  mean:   175.65 µs
  stdev:   56.1 µs
  sample: 500
get:
  mean:    64.8 µs
  stdev:   13.66 µs
  sample: 500
```
In other words, a single compilation of all entities found in tools/perf/example.lol takes about 175 microseconds, i.e. 0.175 milliseconds, or 0.000175 seconds.
In addition, I added a Student's t-test which tests the hypothesis that two given means are equal. You first need to establish a benchmark, which you can do with `make reference`. See the Makefile for the detailed commands. A reference.json file is saved in tools/perf, and each following `make perf` will test whether the deviations from the reference are statistically significant.
The default significance level for the test is 0.01. Here's how to understand this number: for a difference between means to be considered significant, the probability of obtaining a mean at least as extreme as the one actually observed (the new mean), assuming that the null hypothesis is true, has to be less than 1%.
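A minimal sketch of such a comparison, assuming reference.json stores the same summary statistics shown above (mean, stdev, sample); the names here are illustrative, not the actual l20n.js implementation. Since the samples are large (n = 500), the t distribution is close to normal, so we compare |t| against the two-tailed normal critical value for α = 0.01, which is roughly 2.576:

```javascript
// Welch's t statistic from summary statistics of two samples:
// the difference of means divided by the pooled standard error.
function tStatistic(ref, cur) {
  const se = Math.sqrt(
    (ref.stdev * ref.stdev) / ref.sample +
    (cur.stdev * cur.stdev) / cur.sample
  );
  return (cur.mean - ref.mean) / se;
}

// Two-tailed critical value for alpha = 0.01 with large df
// (normal approximation).
const CRITICAL_0_01 = 2.576;

function isSignificant(ref, cur) {
  return Math.abs(tStatistic(ref, cur)) > CRITICAL_0_01;
}

// Example: the compile numbers above as reference, and a
// hypothetical slower current run.
const reference = { mean: 175.65, stdev: 56.1, sample: 500 };
const current   = { mean: 190.0,  stdev: 60.0, sample: 500 };
console.log(isSignificant(reference, current)); // → true
```

With these numbers t ≈ 3.9, well past the critical value, so the slowdown would be flagged as a statistically significant regression rather than noise.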
