Codebase for Hypnos

Folders

Dataset

  • This folder must be populated first by downloading the datasets; see the Datasets section below.

Fig

  • Figures used in the paper, generated with this codebase

Lib

  • dataset_starter.py: helper file to parallelize the creation of the traffic matrices (TMs)
  • eval_starter.py: helper file to parallelize the evaluation of the algorithm
  • lib.py: file containing functions shared by the TM creation and the evaluation of the algorithm
  • rep_lib.py: library specific to the evaluation of the Repetita dataset
  • tomogravity.py: file containing the tomogravity method to estimate TMs from link loads (see the gravity-model sketch after this list)
  • create_plots.ipynb: Jupyter notebook that creates the plots used in the paper from the results
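
As a rough illustration of the kind of estimation tomogravity.py performs, here is a minimal sketch of the plain gravity-model prior that the tomogravity method starts from. The function name and the per-node ingress/egress arrays are assumptions made for illustration; the actual refinement of the prior against the measured link loads is not shown.

```python
import numpy as np

def gravity_prior(ingress: np.ndarray, egress: np.ndarray) -> np.ndarray:
    """Plain gravity-model prior for a traffic matrix (illustrative only).

    ingress[i] -- total traffic entering the network at node i (Gbit/s)
    egress[j]  -- total traffic leaving the network at node j (Gbit/s)
    Returns an N x N matrix T with T[i, j] proportional to ingress[i] * egress[j].
    """
    total = egress.sum()  # assumes total ingress equals total egress
    return np.outer(ingress, egress) / total

# Illustrative values for a three-node network.
print(gravity_prior(np.array([10.0, 5.0, 2.0]), np.array([8.0, 6.0, 3.0])))
```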

Repetita

  • repetita.ipynb: evaluates the algorithm on the Repetita dataset

Surfnet

  • create_dataset_surf.py: loads the topology and link loads from the dataset to create a networkx graph and the TM
  • eval_surf.py: loads the networkx graph and TM, evaluates the sleep algorithm, and saves the results

Switch

  • create_dataset_switch.py: loads the topology and link loads from the dataset to create a networkx graph and the TM
  • eval_switch.py: loads the networkx graph and TM, evaluates the sleep algorithm, and saves the results

TM

  • Folder where the scripts save the traffic matrices (TMs)

Results

  • Folder where the scripts save the results of the evaluation

Python

Python version: 3.10.12

Used packages:

numpy==1.26.4
networkx==3.2.1
torch==2.3.1
pandas==2.2.1 
plotly==5.20.0

Datasets

Repetita

You can find the dataset at https://github.com/svissicchio/Repetita.

SWITCH LAN & SURFnet

The dataset is available at https://doi.org/10.5281/zenodo.12580396.

Your own dataset

Depending on the format of your dataset, you might need to write your own loading function.

You should be able to reuse the rest of the functions as long as you can create a networkx MultiDiGraph that contains the following fields for each link:

key:     tuple containing (<source interface name>, <destination interface name>)
util:    link load in Gbit/s
avail:   remaining capacity (max_bw minus util)
max_bw:  link capacity in Gbit/s
weight:  routing weight/metric
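
As a minimal sketch, such a graph could be built along the following lines; the node names, interface names, and numbers are made up for illustration.

```python
import networkx as nx

# Illustrative example of the expected per-link attributes.
G = nx.MultiDiGraph()
G.add_edge(
    "router_a", "router_b",
    key=("if_a_0", "if_b_1"),  # (source interface name, destination interface name)
    util=12.5,                 # link load in Gbit/s
    max_bw=100.0,              # link capacity in Gbit/s
    avail=100.0 - 12.5,        # remaining capacity
    weight=10,                 # routing weight/metric
)
```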

You will most likely also need to adapt the paths in the scripts to point to the correct folders.