
deeptime's Introduction

Learning Deep Time-index Models for Time Series Forecasting (ICML 2023)



Figure 1. Overall approach of DeepTime.

Official PyTorch code repository for the DeepTime paper. Check out our blog post!

  • DeepTime is a deep time-index based model trained via a meta-optimization formulation, yielding a strong method for time-series forecasting.
  • Experiments on real-world datasets in the long-sequence time-series forecasting setting demonstrate that DeepTime achieves competitive results with state-of-the-art methods and is highly efficient.

Requirements

Dependencies for this project can be installed by:

pip install -r requirements.txt

Quick Start

Data

To get started, you will need to download the datasets as described in our paper:

  • Pre-processed datasets can be downloaded from Tsinghua Cloud or Google Drive, as obtained from Autoformer's GitHub repository.
  • Place the downloaded datasets into the storage/datasets/ folder, e.g. storage/datasets/ETT-small/ETTm2.csv.
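Before building any experiments, it can help to confirm the files landed where the scripts expect them. A minimal sketch (the path below is the ETTm2 example from the bullet above; `dataset_ready` is a hypothetical helper, not part of this codebase):

```python
from pathlib import Path

def dataset_ready(path: str) -> bool:
    """Return True if a pre-processed dataset CSV exists at the given path."""
    return Path(path).is_file()

# Expected location for ETTm2 (other datasets follow the same layout).
print(dataset_ready("storage/datasets/ETT-small/ETTm2.csv"))
```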

Reproducing Experiment Results

We provide scripts to quickly reproduce the results reported in our paper. There are two options: run the full hyperparameter search, or directly run the experiments with the hyperparameters provided in the configuration files.

Option A: Run the full hyperparameter search.

  1. Run the following command to generate the experiments: make build-all path=experiments/configs/hp_search.
  2. Run the following script to perform training and evaluation: ./run_hp_search.sh (you may need to run chmod u+x run_hp_search.sh first).

Option B: Directly run the experiments with hyperparameters provided in the configuration files.

  1. Run the following command to generate the experiments: make build-all path=experiments/configs/ETTm2.
  2. Run the following script to perform training and evaluation: ./run.sh (you may need to run chmod u+x run.sh first).

Finally, results can be viewed on tensorboard by running tensorboard --logdir storage/experiments/, or in the storage/experiments/experiment_name/metrics.npy file.
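If you prefer to inspect the metrics programmatically, `metrics.npy` can be read back with NumPy. This is a hedged sketch: we assume the file was written with `np.save` and may wrap a pickled dict of metric values, so a dummy file is created here purely for illustration:

```python
import numpy as np

# Illustration only: write a dummy metrics file in the assumed format.
np.save("metrics.npy", {"MSE": 0.25, "MAE": 0.32})

# allow_pickle=True is needed when the saved array wraps a Python dict
# (an assumption about the format; a plain ndarray loads without it).
metrics = np.load("metrics.npy", allow_pickle=True).item()
print(metrics)
```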

Main Results

We conduct extensive experiments on both synthetic and real-world datasets, showing that DeepTime has extremely competitive performance, achieving state-of-the-art results on 20 out of 24 settings of the multivariate forecasting benchmark based on MSE.



Detailed Usage

Further details of the code repository can be found here. The codebase is structured to generate experiments from a .gin configuration file based on the build.variables_dict argument.

  1. First, build the experiment from a config file. We provide two ways to build an experiment.
    1. Build a single config file:
      make build config=experiments/configs/folder_name/file_name.gin
      
    2. Build a group of config files:
      make build-all path=experiments/configs/folder_name
  2. Next, run the experiment using the following command
    python -m experiments.forecast --config_path=storage/experiments/experiment_name/config.gin run
    Alternatively, the first step generates a command file at storage/experiments/experiment_name/command, which you can run with:
    make run command=storage/experiments/experiment_name/command
  3. Finally, you can observe the results on tensorboard
    tensorboard --logdir storage/experiments/
    or view the storage/experiments/deeptime/experiment_name/metrics.npy file.

Acknowledgements

The implementation of DeepTime relies on resources from the following codebases and repositories; we thank the original authors for open-sourcing their work.

Citation

Please consider citing our paper if you find this code useful for your research.

@InProceedings{pmlr-v202-woo23b,
  title     = {Learning Deep Time-index Models for Time Series Forecasting},
  author    = {Woo, Gerald and Liu, Chenghao and Sahoo, Doyen and Kumar, Akshat and Hoi, Steven},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {37217--37237},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/woo23b/woo23b.pdf},
  url       = {https://proceedings.mlr.press/v202/woo23b.html}
}

deeptime's People

Contributors

dependabot[bot], gorold


deeptime's Issues

Explanation of equation 1 and 2 in the paper

Hello,

If possible, could you please explain equations 1 and 2 in greater detail?

A) What does the superscript * signify?
B) Are φ and θ neural networks?

I just want to understand the role of these equations. Thank you.


How to run your code in Windows

Hello,
My computer runs Windows, and I always get errors when I run make. Is there a way to run the code directly with python xx.py, without using make?

For example, when I run the code make build-all path=experiments/configs/ETTm2, I get the following error:

/usr/bin/sh: line 1: make: command not found
/usr/bin/sh: line 1: make: command not found
/usr/bin/sh: line 1: make: command not found
/usr/bin/sh: line 1: make: command not found
/usr/bin/sh: line 1: make: command not found
/usr/bin/sh: line 1: make: command not found
/usr/bin/sh: line 1: make: command not found
/usr/bin/sh: line 1: make: command not found
make: *** [Makefile:7: build-all] Error 127

In addition, I tried various methods to install make, but none of them solved the problem (it may be an issue with my machine).

Thank you.

scikit-learn API support

model = DeepTIMe(...)
model.fit(X, y)
y_pred = model.predict(...)

See XGBoost as an example.

Thanks

Two questions

Dear author, thank you for your great work! I have two questions.

  1. What is the meaning of "96, 192, 336, 720" in the "Metrics" column of Table 1 in your paper?

  2. I tried to reproduce the experimental results through Option B, but I ran into an error. Could you give me some advice?

Thank you very much! I really appreciate it!

Makefile usage

Hello,

Thank you for releasing the code.
I was able to recreate the results.

I am new to shell scripting and Makefiles, so while going through the code I had a few questions about the Makefile and the shell scripts (run.sh).

In Makefile

build-all: .require-path
	for config in $(shell ls ${path}/*.gin); do \
		make build config=$$config; \
	done
  1. What is the initial value of config in the for loop?
build: .require-config
	python -m experiments.forecast --config_path=${config} build_experiment
  1. What is happening in the above script?

  2. While running the make command, I saw that the config files are saved again under storage/experiments. Where exactly does this happen in the code?

In run.sh

for dataset in ECL ETTm2 Exchange ILI Traffic Weather; do
  for instance in `/bin/ls -d storage/experiments/$dataset/*/*`; do
      echo $instance
      make run command=${instance}/command
  done
done
  1. If the outer loop is used to move from one dataset to another (ECL, ETTm2, ...), what is dataset: a file name or a variable?
  2. In the inner loop, what does the line make run command=${instance}/command do?

Please let me know.
Thanks

Regards
Niharika

Notebook

Can you release an example Jupyter notebook with out-of-sample predictions for the air passengers dataset? It would make trying the model out a lot easier. Did you compare your model to a structural time series model in TensorFlow Probability (TFP)?

Did I get it all right: DeepTime is a state-space model with a neural network for shifting distributions, learned from an ensemble with meta-learning?

Could N-HiTS be improved with your index-based approach instead of the historical-data style it uses now?


def __getitem__(self, idx: int) -> Tuple[np.ndarray, np.ndarray, np.ndarray, np.ndarray]:
    if self.time_features:
        dim_idx = idx // self.n_time_samples
        dim_slice = slice(dim_idx, dim_idx + 1)
        idx = idx % self.n_time_samples
    else:
        dim_slice = slice(None)

    x_start = idx
    x_end = x_start + self.lookback_len
    y_start = x_end - self.lookback_aux_len
    y_end = y_start + self.lookback_aux_len + self.horizon_len

    x = self.data_x[x_start:x_end, dim_slice]
    y = self.data_y[y_start:y_end, dim_slice]
    x_time = self.timestamps[x_start:x_end]
    y_time = self.timestamps[y_start:y_end]

    return x, y, x_time, y_time
    
    
Hello, thank you for sharing the code. Could you explain why the data is retrieved in this way here?


How to make predictions after the model is trained

Hello, thank you for sharing the code. I have a question: can the model make predictions from only one time feature after training? How would such a prediction be done? Is there a file implementing the prediction step?

Denormalize preds and trues

Hello,
Thank you for releasing the code.

I am trying to plot the ground truths and predictions on the test data, but the values are normalized. I tried to denormalize the predictions and ground truths using the inverse_transform of the standard scaler, but the values were not denormalized. Is there another way to denormalize them?

Can you please help with this issue?

Thank you
Niharika Joshi
