# DiRAC Benchmarks (Extreme Scaling)

This repository contains data and analysis for benchmarking the Grid data parallel C++ mathematical object library on the DiRAC systems.

## Grid

### Obtaining the source code

The source code can be obtained from the public GitHub repository: https://github.com/paboyle/Grid

All benchmarks were run using the source from the `release/dirac-ITT` branch. The typical process for checking out the code is:

```
git clone https://github.com/paboyle/Grid
cd Grid
git checkout release/dirac-ITT
```

### Compiling Grid

General information on compiling Grid can be found at:

with specific instructions for the benchmark at:

The build process for each of the systems we have run on can be found in the doc/ subfolder.
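
As an illustration only, a typical Grid build follows the autotools workflow shown below; the SIMD target, communications layer, and compiler here are assumptions, and the system-specific scripts in the doc/ subfolder are the authoritative record of what was actually used on each machine.

```
# Illustrative build sketch only -- the configure flags below (SIMD target,
# comms layer, compiler) are assumptions; see the system-specific scripts in doc/.
./bootstrap.sh
mkdir build && cd build
../configure --enable-comms=mpi-auto --enable-simd=AVX512 CXX=mpiicpc
make -j 8
```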

### Running the Grid benchmark

Information on running the DiRAC Grid benchmark can be found at:

Note that the MPI task distribution across the problem dimensions at different node counts follows these rules:

- Distribute tasks as evenly as possible across the dimensions
- Place larger factors in the left-most dimensions

This leads to the following distributions when using 2 MPI tasks per node (usually best on two-socket nodes); an example launch command follows the list:

- 1 node (2 MPI tasks): 2.1.1.1
- 2 nodes (4 MPI tasks): 2.2.1.1
- 4 nodes (8 MPI tasks): 2.2.2.1
- 8 nodes (16 MPI tasks): 2.2.2.2
- 16 nodes (32 MPI tasks): 4.2.2.2
- 32 nodes (64 MPI tasks): 4.4.2.2
- etc.
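
For example, a run on 8 nodes with 2 MPI tasks per node uses the 2.2.2.2 decomposition. The sketch below follows Grid's usual command-line conventions (the `Benchmark_ITT` binary and its `--mpi` option), but the exact binary, launcher, and options for each run are assumptions here; the run scripts and the output in results/ record what was actually used.

```
# Hypothetical launch: 8 nodes x 2 tasks/node = 16 MPI tasks, decomposed as 2.2.2.2.
# Binary name and options follow Grid conventions; check the actual run scripts.
mpirun -np 16 ./Benchmark_ITT --mpi 2.2.2.2
```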

The benchmark has been run on the following systems:

| System | Node | Interconnect | Compiler/MPI combinations |
| --- | --- | --- | --- |
| DiRAC Memory Intensive (COSMA7) | 2x Intel Xeon Skylake Gold 5120, 512 GiB DDR4 | Mellanox EDR | Intel 2018; Intel MPI 2018 |
| DiRAC Extreme Scaling (Tesseract) | 2x Intel Xeon Skylake Silver 4112, 92 GiB DDR4 | Intel OPA | Intel 2018; Intel MPI 2018 |

Output from benchmark runs can be found in the results/ subdirectory.

### Evaluating benchmark performance

Information on evaluating performance can be found at:

Scripts we have used to extract performance and plot comparisons can be found in the analysis/ subdirectory.
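
As a minimal sketch, assuming the benchmark logs report Mflop/s figures, the raw numbers can be pulled out of the run output with something like the following; the scripts in analysis/ do the authoritative extraction and plotting.

```
# Rough extraction sketch -- assumes the logs contain "Mflop/s" performance lines;
# adjust the pattern to the actual output format of your Grid version.
grep -ri "mflop/s" results/ | sort
```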

## Further information

For further information on these benchmarks, please contact the DiRAC Helpdesk at: [email protected]