[KDD-24] ImputeFormer: Low Rankness-Induced Transformers for Generalizable Spatiotemporal Imputation
Our official implementation based on Torch Spatiotemporal will also be made available soon!
Our motivation: (a) The distribution of singular values in spatiotemporal data is long-tailed, and the existence of missing data can increase its rank (i.e., inflate its singular values). (b) Low-rank models can filter out informative signals and generate a smooth reconstruction, truncating too much energy in the left part of the spectrum. (c) Deep models can preserve high-frequency noise and generate sharp imputations, maintaining too much energy in the right part of the singular spectrum. By combining the generality of low-rank models with the expressivity of deep models, ImputeFormer achieves a signal-noise balance for accurate imputation.
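As a quick illustration of point (a), here is a minimal NumPy sketch showing that masking entries of a low-rank matrix spreads energy into the tail of its singular spectrum (the matrix size, rank, and missing rate below are arbitrary illustrative choices, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "spatiotemporal" matrix: 200 sensors x 300 time steps, approximately rank 5.
X = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 300))

# Zero out roughly 40% of the entries to mimic missing observations.
mask = rng.random(X.shape) < 0.6
X_missing = X * mask

s_full = np.linalg.svd(X, compute_uv=False)
s_miss = np.linalg.svd(X_missing, compute_uv=False)

# The complete matrix concentrates its energy in the first few singular values,
# while masking spreads energy into the tail (i.e., raises the effective rank).
print("top-5 / total energy, complete:", s_full[:5].sum() / s_full.sum())
print("top-5 / total energy, masked:  ", s_miss[:5].sum() / s_miss.sum())
```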
The directory is structured as follows:
```
.
├── config/
│   └── imputation/
│       ├── Imputeformer.yaml
│       ├── brits.yaml
│       ├── grin.yaml
│       ├── saits.yaml
│       ├── spin.yaml
│       └── transformer.yaml
├── experiments/
│   └── run_imputation.py
├── Imputeformer/
│   ├── baselines/
│   ├── imputers/
│   ├── layers/
│   ├── models/
│   └── ...
├── conda_env.yaml
└── tsl_config.yaml
```
Following the instructions in SPIN and tsl, the project dependencies can be installed as follows:

```bash
conda env create -f conda_env.yml
conda activate imputeformer
```
The experiment scripts are in the `experiments` folder.

- `run_imputation.py` is used to run models, including both ImputeFormer and the baselines (a baseline run is sketched after this list). An example of usage is:

  ```bash
  conda activate imputeformer
  python ./experiments/run_imputation.py --config imputation/imputeformer_la.yaml --model-name imputeformer --dataset-name la_block
  ```

- `run_inference.py` is used for inference only, using pre-trained models. An example of usage is:

  ```bash
  conda activate imputeformer
  python ./experiments/run_inference.py --config inference.yaml --model-name imputeformer --dataset-name la_point --exp-name {exp_name}
  ```
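The same script can also run the baselines whose configs sit under `config/imputation/`. For example, a BRITS run might look like the command below; note that the `--model-name` value `brits` is an assumption inferred from the config file name, so check the script's argument parser for the exact identifiers:

```bash
conda activate imputeformer
# Hypothetical baseline run; the model identifier is inferred from config/imputation/brits.yaml
python ./experiments/run_imputation.py --config imputation/brits.yaml --model-name brits --dataset-name la_point
```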
Due to a minor change in ImputeFormer's title, you can simply search for "ImputeFormer" on Google Scholar to find our latest version. The citation information will be updated automatically as the proceedings become available.
If you find this code useful, please consider citing our paper:
```bibtex
@article{nie2023imputeformer,
  title={ImputeFormer: Low Rankness-Induced Transformers for Generalizable Spatiotemporal Imputation},
  author={Nie, Tong and Qin, Guoyang and Ma, Wei and Mei, Yuewen and Sun, Jian},
  journal={arXiv preprint arXiv:2312.01728},
  year={2023}
}
```
We acknowledge SPIN for providing a useful benchmark tool and training pipeline, and Torch Spatiotemporal for helpful model implementations.