# Home
The HSBencher suite of tools is written in Haskell and designed to run and aggregate benchmark data from a distributed execution environment. All the packages can be installed from Hackage. Different packages provide different components that work together in the complete benchmarking system:
- Benchmark running (`hsbencher` package)
- Benchmark data uploading (`hsbencher-fusion`, `hsbencher-codespeed`)
- Benchmark data fetching and plotting (`hsbencher-tool`, `hsbencher-graph`)
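For example, installing the core harness together with the upload and plotting tools from Hackage might look like the following (pick only the packages you need; `hsbencher-codespeed` is analogous):

```shell
# Install the harness library plus the Fusion Table upload and plotting tools.
cabal install hsbencher hsbencher-fusion hsbencher-tool hsbencher-graph
```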
Most use cases break down into (1) those which use HSBencher as a harness to run benchmarks, and (2) those which run benchmarks by some other means but use HSBencher to upload the data post facto to a Fusion Table, enabling the rest of the HSBencher tools.
In the future, other database backends should be supported, but right now the focus is on using Google Fusion Tables to store, retrieve, and view benchmark data.
In the second scenario, the user creates a benchmark suite which, after each benchmarking run, outputs a file in one of two formats:
- Criterion report -- a binary file containing Criterion benchmark data. Upload it with the `hsbencher-fusion-upload-criterion` command.
- CSV file -- typically using a subset/superset of the standard HSBencher benchmark schema. Upload it with the `hsbencher-fusion-upload-csv` command.
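As a rough illustration, a CSV file in the spirit of the HSBencher schema might look like the following. The column names here are representative examples, not the authoritative schema; consult the SchemaStyleGuide for the real column set:

```csv
PROGNAME,VARIANT,ARGS,THREADS,MEDIANTIME
fib,seq,30,1,2.41
fib,par,30,4,0.83
```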
Both commands should be run in the git repository containing the benchmarks that were just run, and on the machine that ran them. Because the above files (especially Criterion reports) do not contain complete metadata about the benchmarking environment, these upload commands harvest that data from the current machine and the current directory before uploading, so they must be run in the right place.
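The kind of harvesting these tools perform can be sketched as follows. This is a simplified illustration using only `base` and `process`; the actual upload commands collect many more fields than the two shown here:

```haskell
import System.Process (readProcess)

-- Collect a couple of the metadata fields that the upload tools
-- harvest from the current working directory's git repository.
-- (Illustrative sketch only; the real tools gather many more fields,
--  e.g. hostname, compiler version, and timestamps.)
main :: IO ()
main = do
  gitHash <- readProcess "git" ["rev-parse", "HEAD"] ""
  branch  <- readProcess "git" ["rev-parse", "--abbrev-ref", "HEAD"] ""
  putStrLn ("GIT_HASH:   " ++ takeWhile (/= '\n') gitHash)
  putStrLn ("GIT_BRANCH: " ++ takeWhile (/= '\n') branch)
```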
In both cases, it is good to read about the SchemaStyleGuide to see a recommended way to use the different columns in the benchmark data schema.
The goal of an HSBencher-based harness is to run a bunch of benchmarks, each as a single process and as the only thing running on a machine, and then repeat this process, varying across a parameter space. Properties of these runs (e.g. time elapsed) are recorded and then stored in some kind of database (see "backends" below), where they can be retrieved.
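Abstractly, the harness's job can be pictured as iterating over a configuration space, as in this pure sketch (using only `base`; the real harness also compiles, launches, and times one process per configuration):

```haskell
-- A toy configuration space: every benchmark crossed with every
-- thread count. The harness runs each configuration as a single
-- process and records properties such as elapsed time.
benchmarks :: [String]
benchmarks = ["fib", "nbody"]

threadCounts :: [Int]
threadCounts = [1, 2, 4, 8]

configs :: [(String, Int)]
configs = [ (b, t) | b <- benchmarks, t <- threadCounts ]

main :: IO ()
main = mapM_ print configs
```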
HSBencher works well with using systems like Jenkins to launch benchmark jobs on many machines, while relying on HSBencher to aggregate the benchmark data back to a single place.
Currently, adding an HSBencher script to a project requires writing Haskell code to enumerate the benchmarks and describe their configuration space. In the future we may support using simple text files to describe benchmark configurations (as we did in a much earlier version), but for now Haskell coding is required if you want HSBencher to run your benchmarks. Thus, the remainder of this page is split into two sections: to help you either (1) use an existing HSBencher setup (no Haskell coding required), or (2) create a new HSBencher setup.
- Create your own Fusion Table
- Build the benchmark harness executable (e.g. `run-benchmarks.hs`)
- Run the benchmark executable
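A minimal harness executable (e.g. `run-benchmarks.hs`) might look roughly like this sketch. The names `defaultMainModifyConfig`, `mkBenchmark`, `benchlist`, and `BenchSpace` follow the `hsbencher` API as documented on Hackage, but the benchmark path, arguments, and environment variable below are made-up examples; adjust them to your project and check against your installed version:

```haskell
import HSBencher

main :: IO ()
main = defaultMainModifyConfig (\conf -> conf { benchlist = benches })

-- One benchmark, identified by its cabal file, run with argument "30"
-- across the parameter space below. (Path and argument are examples.)
benches :: [Benchmark DefaultParamMeaning]
benches =
  [ mkBenchmark "bench/fib/fib.cabal" ["30"] threadSpace ]

-- Vary the thread count via an environment variable (made-up name).
threadSpace :: BenchSpace DefaultParamMeaning
threadSpace =
  Or [ Set (Threads n) (RuntimeEnv "NUMTHREADS" (show n))
     | n <- [1, 2, 4] ]
```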
Backends are plugins, possibly provided by other packages, that can be added to an HSBencher harness to upload the data to a particular destination. We mainly use the Fusion Table backend as of 2014/2015.
- DribbleBackend -- a simple backend for outputting to a local CSV file.
- FusionTableBackend -- upload data to a Google Fusion Table. Corresponds to the module `HSBencher.Backend.Fusion` in the package `hsbencher-fusion`.
- BigQueryBackend -- This doesn't exist yet, but is planned for Tableau integration.
Build methods are Haskell objects that tell the benchmark harness how to compile and run a particular kind of benchmark, i.e. one that uses a particular build system.
- BuiltInBuildMethods -- rudimentary cabal, ghc, and make `BuildMethod`s that ship with the core `hsbencher` package. Note that there are many ways to use GHC, cabal, and make, so each of these built-in build methods establishes some particular conventions.
- CustomBuildMethod_HowTo -- you can always add custom `BuildMethod`s for a project, and we often do.
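Conceptually, a build method pairs a predicate for recognizing a benchmark's build files with commands to compile and run it. The record below is a hypothetical, simplified stand-in, not the actual `BuildMethod` type from `hsbencher` (which has more fields; see the wiki pages above):

```haskell
-- Hypothetical, simplified stand-in for hsbencher's BuildMethod record.
data SimpleBuildMethod = SimpleBuildMethod
  { methodName :: String            -- e.g. "make"
  , canBuild   :: FilePath -> Bool  -- recognizes a benchmark's build file
  , compileCmd :: FilePath -> String
  , runCmd     :: FilePath -> [String] -> String
  }

-- A make-based method: benchmarks are directories containing a Makefile.
makeMethod :: SimpleBuildMethod
makeMethod = SimpleBuildMethod
  { methodName = "make"
  , canBuild   = \f -> f == "Makefile"
  , compileCmd = \dir -> "make -C " ++ dir
  , runCmd     = \dir args -> unwords ((dir ++ "/run") : args)
  }
```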
If you are interested in contributing build methods for common build systems like Maven or Scala's sbt, please let us know.