This is the official repository for the paper *EMForecaster: A Deep Learning Framework for Time Series Forecasting in Wireless Networks with Distribution-Free Uncertainty Quantification*.
EMForecaster is a novel deep learning architecture specialized for time series forecasting, benchmarked primarily on electromagnetic field (EMF) exposure forecasting. EMForecaster also includes a conformal prediction pipeline for uncertainty quantification, along with a trade-off score metric, which we propose as a unified measure of model performance that balances the width of prediction intervals (to be minimized) against empirical coverage (to be maximized).
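The exact trade-off score is defined in the paper; as a rough illustration of the two quantities it balances, the sketch below (an illustrative snippet, not the paper's formula) computes empirical coverage and mean interval width for a set of prediction intervals:

```python
# Illustrative sketch only: empirical coverage and mean interval width for
# prediction intervals. This is NOT the paper's trade-off score formula.
import numpy as np

def coverage_and_width(y_true, lower, upper):
    """Return (empirical coverage, mean interval width) for prediction intervals."""
    y_true, lower, upper = map(np.asarray, (y_true, lower, upper))
    covered = (y_true >= lower) & (y_true <= upper)
    coverage = covered.mean()        # fraction of targets inside their interval (maximize)
    width = (upper - lower).mean()   # average interval width (minimize)
    return coverage, width

# Dummy example with three targets and their intervals
y = np.array([1.0, 2.0, 3.0])
lo = np.array([0.5, 1.8, 2.0])
hi = np.array([1.5, 2.5, 2.5])
print(coverage_and_width(y, lo, hi))  # -> (1.0, 0.733...)
```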
- Python $\geq$ 3.10
```bash
# Create and activate conda environment
conda create -n emforecaster python=3.10
conda activate emforecaster

# Install requirements
pip install -e .
```
```bash
# Create and activate virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install requirements
pip install -e .
```
The data is proprietary and provided primarily by Luca Chiaraviglio; please contact him for access.
Execute the main script with the desired model:

```bash
python main.py <model>
```
Results are saved in the `logs` folder when offline logging is used. To run custom experiments, edit `emforecaster/jobs/exp/<model>/args.yaml` for each model; the available parameters correspond to the configuration classes in `config.py`.
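As a purely illustrative sketch (the field names and the `exp` section below are hypothetical, not the actual schema in `config.py`), a YAML experiment file of this kind typically maps onto dataclass-style configuration classes roughly as follows:

```python
# Illustrative sketch only: hypothetical config fields, not the actual schema in config.py.
from dataclasses import dataclass
import yaml  # requires PyYAML

@dataclass
class ExpConfig:
    model: str = "emforecaster"  # hypothetical field
    seq_len: int = 336           # hypothetical field
    pred_len: int = 96           # hypothetical field
    neptune: bool = False        # offline logging by default

def load_config(path: str) -> ExpConfig:
    """Read an args.yaml-style file and map its 'exp' section onto the dataclass."""
    with open(path) as f:
        raw = yaml.safe_load(f)
    return ExpConfig(**raw.get("exp", {}))

# Example (hypothetical path):
# cfg = load_config("emforecaster/jobs/exp/emforecaster/args.yaml")
```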
Hyperparameter tuning is also available by modifying `emforecaster/jobs/exp/<model>/ablation.yaml`, which defines a grid search over a set of parameters. To run hyperparameter tuning jobs, use:

```bash
python /tuning/tune.py <model>
```
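Conceptually, a grid search of this kind enumerates the Cartesian product of the listed parameter values. The sketch below is only an illustration with hypothetical parameter names; it is not the actual tuner invoked by `/tuning/tune.py`:

```python
# Conceptual grid-search sketch (hypothetical parameter names; not the actual
# tuner invoked by /tuning/tune.py).
from itertools import product

# What an ablation-style YAML file might expand to after parsing:
grid = {
    "learning_rate": [1e-4, 1e-3],
    "patch_length": [16, 32],
    "dropout": [0.0, 0.1],
}

keys = list(grid)
for values in product(*(grid[k] for k in keys)):
    cfg = dict(zip(keys, values))
    print("Would train with:", cfg)  # a real tuner would train/evaluate here
```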
This project supports online experiment logging via Neptune.ai, which is free for researchers and students.
- Create a Neptune.ai account and API token
- Set your Neptune API token as an environment variable:

  ```bash
  export NEPTUNE_API_TOKEN='your-neptune-api-token'
  ```
- Set `neptune: True` in your `args.yaml` under the `exp` category. See also other parameters such as `run_id` and `exp_id`.
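For reference, this is roughly how Neptune logging works in general; the snippet below is a standalone sketch with a placeholder project name and hypothetical parameter/metric names, not the repository's exact integration:

```python
# Minimal Neptune sketch (standalone illustration; the project name is a placeholder
# and this is not EMForecaster's exact logging integration).
import os
import neptune

run = neptune.init_run(
    project="your-workspace/your-project",      # placeholder project name
    api_token=os.environ["NEPTUNE_API_TOKEN"],  # read from the env variable set above
)

run["parameters"] = {"model": "emforecaster", "pred_len": 96}  # hypothetical parameters
for mse in [0.42, 0.39, 0.37]:                                 # dummy metric values
    run["metrics/val_mse"].append(mse)

run.stop()
```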
The `get_results.py` script reads your `<model>.yaml` file in the `emforecaster/analysis/neptune/ablations` directory and iterates through all model runs (e.g., from a hyperparameter tuning job) to display the best result with respect to a `deciding_metric` (e.g., MSE). All model runs and their configurations are displayed and saved to `results.csv`.
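If you want to post-process that file yourself, a minimal sketch (assuming a hypothetical column name such as `mse`, not the exact schema written by `get_results.py`) looks like this:

```python
# Illustrative post-processing sketch (column names are hypothetical, not the exact
# schema written by get_results.py).
import pandas as pd

df = pd.read_csv("results.csv")

deciding_metric = "mse"                      # lower is better for MSE
best = df.loc[df[deciding_metric].idxmin()]  # row of the best-performing run

print("Best run:")
print(best)
```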
This project is licensed under the MIT License - see the LICENSE file for details.
If you use this code in your research or work, please cite our paper:
```bibtex
@article{mootoo2025emforecaster,
  title  = {EMForecaster: A Deep Learning Framework for Time Series Forecasting in Wireless Networks with Distribution-Free Uncertainty Quantification},
  author = {Mootoo, Xavier and Chiaraviglio, Luca and Tabassum, Hina},
  year   = {2025},
}
```
For queries, please contact the corresponding author at: xmootoo at gmail dot com.
Xavier Mootoo is supported by Canada Graduate Scholarships - Master's (CGS-M) funded by the Natural Sciences and Engineering Research Council (NSERC) of Canada, the Vector Scholarship in Artificial Intelligence, provided through the Vector Institute, Canada, and the Ontario Graduate Scholarship (OGS) granted by the provincial government of Ontario, Canada.
We extend our gratitude to Commune AI for generously providing the computational resources needed to carry out our experiments; in particular, we thank Luca Vivona (@LVivona) and Sal Vivona (@salvivona).