This repository contains the dataset used for the paper "Network Calculus with Flow Prolongation - A Feedforward FIFO Analysis enabled by ML" published in IEEE Transactions on Computers. We refer to the article for a full explanation of the methodology used for generating the dataset.
This is an extension of the DeepFP method initially presented in "Tightening Network Calculus Delay Bounds by Predicting Flow Prolongations in the FIFO Analysis" at IEEE RTAS 2021. Compared to the dataset and method used there, we use reinforcement learning to train the neural network behind DeepFP. Hence, our training dataset does not contain information about the prolongations.
If you use this dataset for your research, please include the following reference in any resulting publication:
```bibtex
@article{GeyerSchefflerBondorf_TC2022,
  author  = {Geyer, Fabien and Scheffler, Alexander and Bondorf, Steffen},
  journal = {IEEE Transactions on Computers},
  title   = {Network Calculus with Flow Prolongation - A Feedforward {FIFO} Analysis enabled by {ML}},
  year    = {2022},
  doi     = {10.1109/TC.2022.3204225},
}
```
The dataset contained in this repository comprises three files:

- `dataset-train.pbz` is the dataset used for training the graph neural network used in DeepFP,
- `dataset-evaluation-small.pbz` is the dataset with small networks (up to 40 flows and 15 servers) used for the evaluation in Section 6,
- `dataset-evaluation-large.pbz` is the dataset with large networks (more than 100 flows and up to 30 servers) used for the evaluation in Section 6.

Additionally, `dataset_structure.proto` details the data structure used for the dataset.
The dataset is stored as serialized protobuf messages using the PBZ file format. The `pbzlib` library, available for several programming languages, can be used to process this file format.
This repository contains an example Python script for parsing the files. To get and execute it:

```shell
$ git clone https://github.com/fabgeyer/dataset-deepfp-extension.git
$ cd dataset-deepfp-extension
$ pip3 install -r requirements.txt
$ python3 example.py dataset-train.pbz
```
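As a minimal sketch of what such a parsing script does (assuming `pbzlib`'s Python API `open_pbz`, which yields one deserialized protobuf message at a time; the counting helper and function names are illustrative, not from the repository):

```python
import os


def summarize(messages):
    """Count the protobuf messages yielded by an iterator."""
    return sum(1 for _ in messages)


def count_messages(path):
    """Open a .pbz file (if present) and count its messages.

    Returns None when the dataset file has not been downloaded yet,
    so the sketch stays runnable without the data.
    """
    if not os.path.exists(path):
        return None
    from pbzlib import open_pbz  # pip3 install pbzlib
    return summarize(open_pbz(path))


# Prints a message count, or None if the file is absent.
print(count_messages("dataset-train.pbz"))
```

In practice one would inspect each message's fields as defined in `dataset_structure.proto` instead of merely counting them.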
The paper contains a motivating example showing the potential effectiveness of flow prolongation on a sample network topology and one potential prolongation, depending on the bottleneck utilization. The network topology is depicted in Figure 2 in the paper. Curve shapes are given in Sec. 4.1 of the paper.
Curve and utilization parameters as well as delay and output burstiness bounds are given in these two CSV files in the folder `motivating_FP_example_data_Fig_2`:

- `fp_sensitivity_delay_bound.csv`
- `fp_sensitivity_output_burstiness_bound.csv`
Two metrics were computed, the second of which is plotted in the paper:

- `improvement_overall[%]` is computed according to Equation 20 in the paper (yet for delay and burstiness).
- `improvement_variable_part[%]` exploits the possibility to easily remove the initial and invariant latency or burstiness of the curves, i.e., it shows the improvement in the variable part of the bounds. It is computed by removing $4T$ or $b$, respectively, from the divisor.
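As a hedged illustration of the difference between the two metrics (Equation 20 is not reproduced in this README; the sketch below assumes the improvement is the relative reduction of the bound with respect to a baseline, and the sample numbers are hypothetical):

```python
def improvement_overall(baseline, fp):
    """Relative improvement in percent: how much smaller the
    flow-prolongation bound is compared to the baseline bound
    (a plausible reading of Equation 20, applied to either the
    delay bound or the output burstiness bound)."""
    return 100.0 * (baseline - fp) / baseline


def improvement_variable_part(baseline, fp, invariant):
    """Same improvement, but measured only on the variable part of
    the bound: the invariant latency (4T) or burstiness (b) is
    removed from the divisor."""
    return 100.0 * (baseline - fp) / (baseline - invariant)


# Hypothetical numbers: baseline delay bound 10, FP bound 9,
# invariant latency 4T = 4.
print(improvement_overall(10.0, 9.0))              # → 10.0
print(improvement_variable_part(10.0, 9.0, 4.0))   # ≈ 16.67
```

Removing the invariant part from the divisor makes the second metric larger, since the analysis can only ever improve the variable part of the bound.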
The data in this repository is licensed under Creative Commons Attribution-ShareAlike 4.0 International (CC BY-SA 4.0).