# Official Implementation of the paper "Denoising Architecture for Unsupervised Anomaly Detection in Time-Series"
This repository contains the official implementation of the paper "Denoising Architecture for Unsupervised Anomaly Detection in Time-Series" by Wadie Skaf and Tomáš Horváth.
- Springer: https://link.springer.com/chapter/10.1007/978-3-031-15743-1_17
- ArXiv preprint: https://arxiv.org/abs/2208.14337
Please check the `requirements.txt` file for the required packages.
The dataset used in this paper is the Yahoo S5 dataset, which can be requested and downloaded from here. The dataset should be placed in the `Datasets` folder.
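As a hedged illustration of how a series from this dataset feeds a model parameterized by `seq_len` (this is not the repository's loader; the column layout and the windowing shown here are assumptions), a Yahoo S5-style value column can be sliced into overlapping windows:

```python
import numpy as np

# Toy stand-in for the "value" column of one Yahoo S5 benchmark CSV
# (the actual files also carry timestamp and anomaly-label columns).
values = np.arange(10, dtype=float)
seq_len = 4  # window length, matching the seq_len parameter below

# Overlapping windows of seq_len consecutive points, one row per window.
windows = np.stack([values[i:i + seq_len]
                    for i in range(len(values) - seq_len + 1)])
```

Each row of `windows` is one fixed-length subsequence; a 10-point series with `seq_len = 4` yields 7 such windows.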
Define the `seq_len` and the architecture parameters in `build_experiments_file.py` and run it to generate the experiments file.
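A minimal sketch of what generating such an experiments file could look like (the field names, parameter values, and grid layout here are illustrative assumptions, not the actual contents of `build_experiments_file.py`):

```python
import itertools
import json

# Hypothetical parameter grid: each experiment pairs a window length
# with a list of layer sizes describing the architecture.
seq_lens = [30, 60]
architectures = [[64, 32], [128, 64, 32]]

# Enumerate every (seq_len, architecture) combination.
experiments = [
    {"seq_len": s, "architecture": a}
    for s, a in itertools.product(seq_lens, architectures)
]

# Write the experiment definitions to a JSON file for later runs.
with open("exps.json", "w") as f:
    json.dump(experiments, f, indent=2)
```

With two window lengths and two architectures, this grid produces four experiment entries.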
- Check the `exps.json` file and make sure the experiments are defined as you wish.
- Run `python experiments.py` to run the experiments.
- The results will be stored in the `experiments_results` folder. Please refer to the `experiments.py` file for more details.
- In case the CSV files are malformed due to the architectures being stored as lists, you can use the `fix_csv_files.py` file to fix them.
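The kind of repair `fix_csv_files.py` performs might look like the following sketch (the column names and cell contents are assumptions for illustration; only the general technique of parsing list-valued cells is shown):

```python
import ast
import csv
import io

# A results row where the architecture was serialized as a Python-list
# string, which confuses naive CSV consumers. Column names are hypothetical.
raw = 'experiment,architecture,f1\n1,"[128, 64, 32]",0.91\n'

rows = list(csv.DictReader(io.StringIO(raw)))
for row in rows:
    # ast.literal_eval safely turns the list-like string back into a list.
    row["architecture"] = ast.literal_eval(row["architecture"])
```

`ast.literal_eval` only evaluates literal structures (lists, numbers, strings), so it is a safe way to recover list-valued cells without `eval`.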
If you find this code useful, please cite our paper:
```bibtex
@InProceedings{skaf_2022_denoising,
  author="Skaf, Wadie
  and Horv{\'a}th, Tom{\'a}{\v{s}}",
  editor="Chiusano, Silvia
  and Cerquitelli, Tania
  and Wrembel, Robert
  and N{\o}rv{\aa}g, Kjetil
  and Catania, Barbara
  and Vargas-Solar, Genoveva
  and Zumpano, Ester",
  title="Denoising Architecture for Unsupervised Anomaly Detection in Time-Series",
  booktitle="New Trends in Database and Information Systems",
  year="2022",
  publisher="Springer International Publishing",
  address="Cham",
  pages="178--187",
  isbn="978-3-031-15743-1"
}
```
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
The last modification to this code was made on 28-12-2021, and it is no longer maintained, so you may encounter issues with the latest versions of the packages used in this code. Please feel free to contact me if you have any questions.