PerfTrack

PerfTrack is a command-line tool that helps detect performance regressions in open-source projects and CI pipelines. It requires no external services and tracks performance locally.

Designed for Python projects (Django, Pandas, FastAPI, ML libraries), internal tooling, and personal benchmarking workflows.

Why PerfTrack?

Performance regressions often go unnoticed because:

  • CI timing is noisy and unreliable for benchmarking
  • Tools like ASV or pytest-benchmark require setup and maintenance
  • There is no simple utility to store a baseline and fail CI when performance drops

PerfTrack stores local performance snapshots and compares them against future runs, both locally and in CI.

What It Measures

  • Wall-clock time
  • CPU time
  • Peak RSS memory usage

Baseline cached locally → regression check in CI.
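These three metrics can all be collected for a child process with the Python standard library. The sketch below is illustrative and not PerfTrack's actual implementation; it assumes a POSIX system, where `resource.getrusage` is available:

```python
import resource
import subprocess
import sys
import time

def measure(cmd: str) -> dict:
    """Run a shell command and collect wall time, CPU time, and peak RSS."""
    start = time.perf_counter()
    subprocess.run(cmd, shell=True, check=True)
    wall_s = time.perf_counter() - start

    # Resource usage aggregated over waited-for child processes.
    usage = resource.getrusage(resource.RUSAGE_CHILDREN)
    cpu_s = usage.ru_utime + usage.ru_stime  # user + system CPU time

    # ru_maxrss is reported in KiB on Linux but in bytes on macOS.
    peak_rss_kib = usage.ru_maxrss // 1024 if sys.platform == "darwin" else usage.ru_maxrss
    return {"wall_s": wall_s, "cpu_s": cpu_s, "peak_rss_kib": peak_rss_kib}
```

Note that wall-clock and CPU time can diverge sharply for I/O-bound or multi-threaded workloads, which is why tracking both is useful.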

Installation

pip install perftrack

Quick Start

# run a command and record performance
perftrack run "python script.py"

# set the latest run as baseline
perftrack baseline set-latest

# later compare new results with baseline
perftrack compare --fail-on-regression

# generate a simple HTML report
perftrack report

Replace "python script.py" with whatever you want to benchmark (scripts, test suites, build steps, ML training, etc.).

Directory Layout

PerfTrack stores results under:

.perftrack/
  baseline.json
  latest.json
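Because the snapshots are plain JSON, a regression check reduces to a ratio comparison between the two files. The field names and the 10% tolerance below are illustrative assumptions, not PerfTrack's actual schema:

```python
import json
from pathlib import Path

def load(path: str) -> dict:
    """Read a snapshot file such as .perftrack/baseline.json."""
    return json.loads(Path(path).read_text())

def regressions(baseline: dict, latest: dict, tolerance: float = 0.10) -> dict:
    """Return the fractional increase for each metric that grew past `tolerance`."""
    return {
        metric: latest[metric] / baseline[metric] - 1
        for metric in baseline
        if latest[metric] > baseline[metric] * (1 + tolerance)
    }
```

A nonempty result would correspond to `perftrack compare --fail-on-regression` exiting with a nonzero status.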

CI Regression Example

PerfTrack helps catch performance regressions before merge. Here is a pull request where we intentionally slowed down the code to demonstrate PerfTrack failing a CI check.
