Enhancing Covariate Conditioning in TimeGrad #149

Open
ProRedCat opened this issue Oct 11, 2023 · 1 comment
@ProRedCat

Currently, in the TimeGrad model, experimental results show that generated sequences adhere poorly to control inputs when using covariate conditioning. The issue stems from the previous sequences in the context window dominating the conditioning: past inputs exert a stronger influence on the autoregressive component than the desired control inputs do.

In the TimeGrad paper, both the covariates and the previous sequence are fed into the RNN, on whose output the model is then conditioned. I propose a modification in which the conditioning information (the covariates) is kept separate from the RNN input and instead concatenated with the autoregressive output to form the conditioner.
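
To make the proposal concrete, here is a minimal PyTorch sketch of the separated conditioning path (module and argument names are hypothetical, not the existing TimeGrad code): the RNN consumes only the lagged targets, and the covariates are concatenated with its hidden state afterwards to form the conditioning input for the diffusion denoiser.

```python
import torch
import torch.nn as nn

class SeparatedConditioner(nn.Module):
    """Hypothetical sketch: condition the diffusion head on
    RNN(past targets) concatenated with the raw covariates, instead of
    feeding the covariates through the RNN together with the targets."""

    def __init__(self, target_dim: int, covariate_dim: int, hidden_dim: int):
        super().__init__()
        # The RNN sees only the autoregressive target inputs.
        self.rnn = nn.GRU(target_dim, hidden_dim, batch_first=True)
        # Project [RNN state; covariates] down to the conditioning vector.
        self.proj = nn.Linear(hidden_dim + covariate_dim, hidden_dim)

    def forward(self, past_targets, covariates):
        # past_targets: (batch, time, target_dim)
        # covariates:   (batch, time, covariate_dim), kept out of the RNN
        h, _ = self.rnn(past_targets)
        cond = torch.cat([h, covariates], dim=-1)
        return self.proj(cond)  # conditioning input for the denoiser
```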

Additionally, introducing per-frame dropout on the inputs before they enter the RNN can reduce the model's reliance on past sequences, allowing the control inputs to condition the output more precisely (see the sketch below).
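
A minimal sketch of per-frame dropout, assuming inputs shaped `(batch, time, features)` (the module name and default rate are hypothetical): whole time steps are kept or dropped independently, rather than individual features.

```python
import torch
import torch.nn as nn

class PerFrameDropout(nn.Module):
    """Hypothetical sketch: drop entire time steps (frames) of the RNN
    input with probability p, so the model cannot rely on every past
    observation being present."""

    def __init__(self, p: float = 0.25):
        super().__init__()
        self.p = p

    def forward(self, x):
        # x: (batch, time, features)
        if not self.training or self.p == 0.0:
            return x
        # One keep/drop decision per (batch, time) position,
        # broadcast across the feature dimension.
        keep = (torch.rand(x.size(0), x.size(1), 1, device=x.device) > self.p)
        return x * keep.to(x.dtype) / (1.0 - self.p)  # inverted-dropout scaling
```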

If there are any questions or concerns about the proposed solution, I'm happy to chat about it.

@coding-loong

TimeGrad predicts the future in an autoregressive way. I worked on replacing that with a method similar to a TCN (temporal convolutional network). However, the prediction performance on the solar and taxi datasets was much worse than TimeGrad's. I suspected this was because my method doesn't use covariate conditioning, but no matter how I incorporated covariates into my method, the predictions did not improve. If possible, we could explore the impact of covariates on prediction performance in a non-autoregressive setting.
By the way, TimeGrad extracts features not only from the historical sequence but also from the sequence being predicted (through its autoregressive rollout) to generate the conditional embeddings. How can this be adapted to a non-autoregressive setting? This may also explain the weak performance of my method.
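
For discussion, here is one possible non-autoregressive arrangement, sketched under the assumption that future covariates are known at prediction time (all names are hypothetical; this is not the commenter's actual code): a dilated-convolution stack summarizes the history, and the known future covariates stand in for the features that TimeGrad would otherwise extract from its autoregressive rollout.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TCNConditioner(nn.Module):
    """Hypothetical sketch: produce conditioning embeddings for the whole
    prediction horizon in one shot, using dilated causal convolutions over
    (past targets + past covariates) plus the known future covariates."""

    def __init__(self, target_dim, covariate_dim, hidden_dim, horizon):
        super().__init__()
        self.horizon = horizon
        self.conv1 = nn.Conv1d(target_dim + covariate_dim, hidden_dim,
                               kernel_size=3, dilation=1)
        self.conv2 = nn.Conv1d(hidden_dim, hidden_dim,
                               kernel_size=3, dilation=2)
        # Future covariates replace the missing autoregressive features.
        self.future_proj = nn.Linear(covariate_dim, hidden_dim)
        self.merge = nn.Linear(2 * hidden_dim, hidden_dim)

    @staticmethod
    def _causal(conv, x):
        # Left-pad so each output step sees only current and past inputs.
        pad = conv.dilation[0] * (conv.kernel_size[0] - 1)
        return conv(F.pad(x, (pad, 0)))

    def forward(self, past_targets, past_covariates, future_covariates):
        # past_*: (batch, T, dim); future_covariates: (batch, horizon, cov_dim)
        hist = torch.cat([past_targets, past_covariates], dim=-1)
        h = F.relu(self._causal(self.conv1, hist.transpose(1, 2)))
        h = F.relu(self._causal(self.conv2, h))[..., -1]   # last-step summary
        h = h.unsqueeze(1).expand(-1, self.horizon, -1)    # broadcast over horizon
        f = self.future_proj(future_covariates)
        return self.merge(torch.cat([h, f], dim=-1))       # (batch, horizon, hidden)
```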
