Pyrite

Pyrite: MPI Communication Tracker and Visualiser

Pyrite is a lightweight MPI tracing and visual analytics toolchain for HPC applications.

Pyrite intercepts MPI communication at runtime and records it as a compact binary trace. Companion tools convert the trace into a streamed visualisation format and extract common communication patterns and likely performance problems, and a web-based rendering tool displays the result on a 3D representation of your physical hardware topology.
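
The recording idea can be sketched in a language-agnostic way: each intercepted call appends one event record to the trace. The Python below is purely illustrative (the real tracker is a C library loaded via LD_PRELOAD that writes a compact binary trace); the record fields shown are an assumption, not Pyrite's actual trace schema.

```python
import time

# Hypothetical event log; the real library records a compact binary trace in C.
trace = []

def record_send(src_rank, dst_rank, nbytes, tag):
    """Append one point-to-point communication event to the trace."""
    trace.append({
        "op": "send",
        "src": src_rank,
        "dst": dst_rank,
        "bytes": nbytes,
        "tag": tag,
        "t": time.time(),  # wall-clock timestamp of the intercepted call
    })

# Simulate rank 0 sending 4 KiB to ranks 1-3
for dst in (1, 2, 3):
    record_send(0, dst, 4096, tag=7)

total_bytes = sum(e["bytes"] for e in trace)
print(len(trace), total_bytes)  # 3 events, 12288 bytes
```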

Features

Visualiser Interface

MPI Communication Visualiser GUI: the web-based visualiser renders core-to-core network traffic as physical 3D tubes with directional arrowheads, alongside live message statistics and analytics overlays.

Project Layout

mpi-comm-tracker/
├── CMakeLists.txt
├── src/
│   ├── CMakeLists.txt
│   ├── mpi_communication_tracking.c
│   └── mpi_communication_tracking.h
├── tools/
│   ├── mpi_data_parser.py
│   ├── topology_generator.py
│   └── slurm_topology_generator.py
├── vis/
│   ├── index.html
│   ├── style.css
│   ├── visualiser.js
│   ├── analytics.js
│   ├── analytics-3d.js
│   └── analytics-controls.js
├── tests/
│   ├── CMakeLists.txt
│   ├── ctest_driver.py
│   ├── trace_parser.py
│   ├── test_*.c
│   └── test_*.f90
└── docs/
    └── developer-guide.md

Quick Start

1. Build

mkdir build
cd build
cmake ..
make

This produces:

build/src/libmpi_comm_tracker.so

2. Run an MPI application under the tracker

LD_PRELOAD=/path/to/build/src/libmpi_comm_tracker.so mpirun -n 16 ./your_mpi_application

At MPI_Finalize, rank 0 writes a trace file like:

your_mpi_application-YYYYMMDDHHMMSS.mpic

3. Generate or provide a hardware map

Optional, but recommended for meaningful 3D placement.

From Slurm

scontrol show topo > my_topo.txt
python tools/slurm_topology_generator.py my_topo.txt --racks_per_cab 4 --out hardware_map.json
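
As a rough illustration of what the Slurm path involves, the sketch below parses `scontrol show topo`-style lines (a switch name plus a node range) into a flat node-to-switch map. The sample field set and the output schema here are assumptions for illustration; they are not the actual format consumed or produced by slurm_topology_generator.py.

```python
import json
import re

# Example scontrol-style output; real fields vary by Slurm version.
SAMPLE = """\
SwitchName=s0 Level=0 Nodes=tux[0-3]
SwitchName=s1 Level=0 Nodes=tux[4-7]
"""

def expand_nodes(expr):
    """Expand a Slurm node range like tux[0-3] into tux0..tux3."""
    m = re.fullmatch(r"(\w+)\[(\d+)-(\d+)\]", expr)
    if not m:
        return [expr]  # single node name, no range
    prefix, lo, hi = m.group(1), int(m.group(2)), int(m.group(3))
    return [f"{prefix}{i}" for i in range(lo, hi + 1)]

node_to_switch = {}
for line in SAMPLE.splitlines():
    fields = dict(f.split("=", 1) for f in line.split())
    for node in expand_nodes(fields["Nodes"]):
        node_to_switch[node] = fields["SwitchName"]

print(json.dumps(node_to_switch, indent=2))
```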

Synthetic

python tools/topology_generator.py \
  --cabinets 2 \
  --racks 2 \
  --nodes 16 \
  --cpus 2 \
  --cores 32 \
  --system_name "My Local Cluster"
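
The nested structure such a generator emits might look like the sketch below, mirroring the CLI flags above (2 cabinets x 2 racks x 16 nodes x 2 CPUs x 32 cores). The JSON schema shown is a guess for illustration, not the actual hardware_map.json layout.

```python
def make_topology(cabinets, racks, nodes, cpus, cores, system_name):
    """Build a nested cabinet -> rack -> node -> cpu -> core hierarchy
    (hypothetical schema, for illustration only)."""
    return {
        "system": system_name,
        "cabinets": [
            {"id": c, "racks": [
                {"id": r, "nodes": [
                    {"id": n, "cpus": [
                        {"id": p, "cores": list(range(cores))}
                        for p in range(cpus)
                    ]}
                    for n in range(nodes)
                ]}
                for r in range(racks)
            ]}
            for c in range(cabinets)
        ],
    }

topo = make_topology(2, 2, 16, 2, 32, "My Local Cluster")
total_cores = 2 * 2 * 16 * 2 * 32  # product of the flags above
print(total_cores)  # 4096
```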

4. Parse and analyse the trace

python tools/mpi_data_parser.py your_mpi_application-YYYYMMDDHHMMSS.mpic hardware_map.json

This creates:

your_mpi_application-YYYYMMDDHHMMSS.mpix
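
The .mpic trace is a compact binary format whose record layout is internal to Pyrite, but the parsing step is conceptually a fixed-size record walk like the hypothetical sketch below. The struct layout here (op code, source rank, destination rank, byte count, timestamp) is invented for illustration and does not describe the real .mpic format.

```python
import struct

# Hypothetical 25-byte record: op (u8), src (i32), dst (i32),
# byte count (u64), timestamp (f64). Little-endian, no padding.
RECORD = struct.Struct("<BiiQd")

def write_records(events):
    """Pack a list of event tuples into a binary blob."""
    return b"".join(RECORD.pack(*e) for e in events)

def read_records(blob):
    """Walk the blob in fixed-size steps, unpacking one record each."""
    return [RECORD.unpack_from(blob, off)
            for off in range(0, len(blob), RECORD.size)]

events = [(1, 0, 1, 4096, 0.5), (1, 0, 2, 4096, 0.6)]
blob = write_records(events)
parsed = read_records(blob)
print(len(parsed), parsed[0][3])  # 2 4096
```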

5. Open the visualiser

Open:

vis/index.html

Then load the generated .mpix file.


What the Parser Produces

The .mpix container includes:

The analysis layer includes:


Visual Analytics

The frontend provides:


Supported MPI Calls

Point-to-point

Completion / synchronization

Collectives


Running Tests

cd build
ctest --output-on-failure

Optional Fortran test support (AUTO builds the Fortran tests only if a suitable compiler is found; ON requires them; OFF disables them):

cmake -S . -B build -DMPI_TRACE_FORTRAN_TESTS=AUTO
cmake -S . -B build -DMPI_TRACE_FORTRAN_TESTS=ON
cmake -S . -B build -DMPI_TRACE_FORTRAN_TESTS=OFF

Limitations


Documentation

For internal details, trace format notes, parser behaviour, frontend module layout, analytics overlays, testing, and extension guidance, see docs/developer-guide.md.

Authors

Pyrite was developed by Adrian Jackson.

License

Apache 2.0. See LICENSE.