Enhancing Covariate Conditioning in TimeGrad #149
TimeGrad predicts the future autoregressively. I did some work to replace this with a TCN-like (non-autoregressive) method; however, its prediction performance on the solar and taxi datasets is much worse than TimeGrad's. I suspect this is because my method doesn't use a covariate condition, but no matter how I incorporated covariates into my method, the predictions did not improve. If possible, we could explore the impact of covariates on prediction performance in a non-autoregressive setting.
Currently, experimental results with the TimeGrad model show poor adherence to control inputs when conditioning on covariates. The issue stems from the previous sequences in the context window dominating the autoregressive component, so past inputs exert a stronger influence than the desired control inputs.
In the TimeGrad paper, both covariates and previous sequences are fed into the RNN, which the model is then conditioned on. I propose a modification where the conditioning information is separated from the input data and concatenated with the autoregressive output for conditioning.
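A rough sketch of the difference between the two conditioning schemes, using NumPy stand-ins rather than the actual GluonTS/PyTorch implementation. All shapes, the random-projection `rnn` helper, and the variable names are illustrative assumptions, not TimeGrad's real code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: time steps, target dim, covariate dim, RNN hidden size.
T, D, C, H = 24, 370, 4, 40

x_prev = rng.normal(size=(T, D))  # previous target sequence
covs = rng.normal(size=(T, C))    # known covariates (time features, etc.)

def rnn(inputs, hidden_size):
    """Stand-in for the LSTM: a fixed random projection per step."""
    W = np.random.default_rng(1).normal(size=(inputs.shape[-1], hidden_size))
    return np.tanh(inputs @ W)

# Original TimeGrad-style conditioning: covariates enter *through* the RNN,
# so they are mixed with (and can be drowned out by) the past sequence.
cond_orig = rnn(np.concatenate([x_prev, covs], axis=-1), H)  # (T, H)

# Proposed: keep covariates out of the RNN and concatenate them with the
# autoregressive hidden state afterwards, so the diffusion conditioner
# sees the control inputs directly.
h_ar = rnn(x_prev, H)
cond_prop = np.concatenate([h_ar, covs], axis=-1)  # (T, H + C)

print(cond_orig.shape, cond_prop.shape)
```

The point is only where the covariates enter: in the proposed variant they reach the conditioning vector unmixed, so the denoiser cannot ignore them without also ignoring part of its input.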
Additionally, introducing per-frame dropout before input into the RNN can help reduce the model's reliance on past sequences, thereby allowing for more precise control input conditioning.
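Per-frame dropout here means zeroing out entire time steps of the past sequence, not individual features. A minimal sketch (the function name, dropout rate, and shapes are my assumptions; this uses standard inverted-dropout scaling):

```python
import numpy as np

rng = np.random.default_rng(0)

def per_frame_dropout(x, p, rng, training=True):
    """Zero out whole time steps of x (shape [T, D]) with probability p,
    scaling kept frames by 1/(1-p) so the expected value is unchanged."""
    if not training or p == 0.0:
        return x
    keep = (rng.random(size=(x.shape[0], 1)) >= p).astype(x.dtype)
    return x * keep / (1.0 - p)

x_prev = rng.normal(size=(24, 370))
x_dropped = per_frame_dropout(x_prev, p=0.25, rng=rng)

# Dropped frames are entirely zero, so on those steps the RNN gets no
# information from the past sequence and must lean on the conditioning.
frame_zero = np.all(x_dropped == 0, axis=1)
print(int(frame_zero.sum()), "of", x_dropped.shape[0], "frames dropped")
```

Because whole frames vanish at random during training, the model cannot rely on the previous sequence always being present, which should rebalance its attention toward the control inputs.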
If there are any questions or concerns about the proposed solution, I'm happy to chat about it.