
Exception: Reached maximum number of idle transformation calls. #145

Open

domainsense opened this issue Aug 18, 2023 · 11 comments

domainsense commented Aug 18, 2023

Up until now, I haven't been able to solve this problem. Could you please provide me with some assistance? Thank you very much!

In your paper (Rasul K., Seward C., Schuster I., et al. "Autoregressive denoising diffusion models for multivariate probabilistic time series forecasting." International Conference on Machine Learning, PMLR, 2021: 8857-8868), do you obtain the mean and variance of the predicted time step through an RNN, and then use a DPM to obtain the distribution of the predicted values? My email is [email protected]. I am greatly looking forward to receiving your assistance.
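
For context, here is a minimal, hypothetical sketch (not the paper's or this repository's code) of the structure being asked about: an RNN summarizes the observed context into a hidden state, and a denoising diffusion model conditioned on that state models the full distribution of the next multivariate observation, so samples can be drawn from it rather than only a mean and variance being predicted. All sizes and the corruption step below are simplified placeholders.

import torch
import torch.nn as nn

input_size = 8        # number of series in the multivariate target (illustrative)
hidden_size = 32
num_diffusion_steps = 100

# RNN that encodes the observed history into a conditioning state.
rnn = nn.GRU(input_size, hidden_size, batch_first=True)

# Epsilon-network: predicts the noise added to the target, conditioned on the
# RNN state and a (here scalar) diffusion-step embedding.
eps_net = nn.Sequential(
    nn.Linear(input_size + hidden_size + 1, 64),
    nn.ReLU(),
    nn.Linear(64, input_size),
)

context = torch.randn(4, 24, input_size)    # (batch, context_length, input_size)
_, h = rnn(context)
h = h.squeeze(0)                            # (batch, hidden_size)

# One denoising training step on the "next" observation (toy data).
x0 = torch.randn(4, input_size)
k = torch.randint(0, num_diffusion_steps, (4, 1)).float() / num_diffusion_steps
noise = torch.randn_like(x0)
x_noisy = x0 + noise                        # placeholder; the real model uses the DDPM noise schedule
pred_noise = eps_net(torch.cat([x_noisy, h, k], dim=-1))
loss = ((pred_noise - noise) ** 2).mean()
loss.backward()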

kashif (Collaborator) commented Aug 18, 2023

Can you try with the 0.7.0 branch and an updated gluonts?

domainsense (Author) commented Aug 18, 2023 via email

kashif (Collaborator) commented Aug 18, 2023

Try pip install diffusers; perhaps I forgot to add it to setup.py.

domainsense (Author) commented Aug 18, 2023 via email

domainsense (Author) commented Aug 19, 2023 via email

domainsense (Author) commented Aug 19, 2023 via email

@xiyuanzh

Hi @kashif, I want to use the DeepVAR model, which is not supported in the 0.7.0 branch. If I use the default branch, I get the "Reached maximum number of idle transformation calls" error. Do you have any suggestions? Thanks a lot!


ProRedCat commented Sep 1, 2023

Version 0.7.0 does not have the updated model creation shown in the TimeGrad example. You need to import a scheduler (the diffusion solver) from the diffusers library: https://huggingface.co/docs/diffusers/api/schedulers/overview.

For the "Reached maximum number of idle transformation calls." issue, it's possible you do not have enough data; one solution is simply to duplicate the data. Another issue is that if you're using custom data, you must ensure the provided data is in the shape (input_size, timesteps) instead of (timesteps, input_size); see the data-preparation sketch after the example below. It is possible that 0.7.0 has solved this issue, as I have not encountered it since moving versions, but I am using custom data.

Example of how to instantiate TimeGrad in 0.7.0:

# The first two imports below are assumptions about the surrounding setup
# (the GluonTS dataset repository and the pytorch-ts import path for
# TimeGradEstimator); adjust them to your environment.
from gluonts.dataset.repository.datasets import get_dataset
from pts.model.time_grad import TimeGradEstimator
from diffusers import DEISMultistepScheduler

dataset = get_dataset("electricity")  # e.g. the Electricity dataset discussed below

scheduler = DEISMultistepScheduler(
    num_train_timesteps=150,
    beta_end=0.1,
)

estimator = TimeGradEstimator(
    input_size=int(dataset.metadata.feat_static_cat[0].cardinality),
    hidden_size=64,
    num_layers=2,
    dropout_rate=0.1,
    lags_seq=[1],
    scheduler=scheduler,
    num_inference_steps=150,
    prediction_length=dataset.metadata.prediction_length,
    context_length=dataset.metadata.prediction_length,
    freq=dataset.metadata.freq,
    scaling="mean",
    trainer_kwargs=dict(max_epochs=200, accelerator="gpu", devices="1"),
)
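
To illustrate the data-shape point above, here is a rough, hypothetical sketch of feeding custom multivariate data to the estimator, with the target shaped (input_size, timesteps). The field names follow the standard GluonTS ListDataset convention; the start date, frequency, and sizes are placeholders.

import numpy as np
from gluonts.dataset.common import ListDataset

input_size = 4     # number of series (placeholder)
timesteps = 500    # if your history is very short, tiling/duplicating it is one
                   # workaround for the "idle transformation calls" error

target = np.random.randn(input_size, timesteps)   # (input_size, timesteps), NOT (timesteps, input_size)

train_ds = ListDataset(
    [{"start": "2020-01-01 00:00:00", "target": target}],
    freq="H",
    one_dim_target=False,   # multivariate target
)

# predictor = estimator.train(train_ds)   # estimator as constructed above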

@nonconvexopt

@ProRedCat I get poor performance with version 0.7.0. Can you reproduce performance similar to that recorded in the timegrad-electricity notebook file?

@ProRedCat

It could be the solver you're using; DEISMultistepScheduler is a fast ODE solver, but it may perform worse than some other solvers.

I was able to get a lower score on Electricity of 0.018 with the DEISMultistepScheduler, but the number of epochs was set to 200. What sort of performance numbers are you getting?
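
If the gap comes down to the solver, swapping the scheduler is a small change; the sketch below mirrors the keyword values from the example above (they are not tuned recommendations).

from diffusers import DDPMScheduler, DEISMultistepScheduler

# Ancestral DDPM sampling: slower, typically run with more inference steps.
ddpm_scheduler = DDPMScheduler(num_train_timesteps=150, beta_end=0.1)

# DEIS: a fast ODE-based multistep solver; fewer steps, sometimes lower quality.
deis_scheduler = DEISMultistepScheduler(num_train_timesteps=150, beta_end=0.1)

# estimator = TimeGradEstimator(..., scheduler=ddpm_scheduler, num_inference_steps=150, ...)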

@nonconvexopt

@ProRedCat I used the DDPMScheduler but ran only 20 epochs. That might be the reason. I will try the DEISMultistepScheduler with more epochs. Thanks.
