Why is TimeStep(torch.autograd.Function) with manually designed gradients used? #10

Open · Wangxi404 opened this issue Jun 4, 2024 · 0 comments

@Wangxi404

Thanks for this outstanding work! May I ask why you used the class TimeStep(torch.autograd.Function) with @staticmethod forward and backward methods (in cell.py)? In this class, the backward gradients are designed manually, so why isn't automatic differentiation applied here directly?
In cell.py I tested replacing y = TimeStep.apply(b, c, h1, h2, self.dt, self.geom.h) with y = _time_step(b, c, h1, h2, self.dt, self.geom.h) to bypass your manually designed gradients, and the results are pretty similar. I am quite interested and curious about why you coded it this way.

Thanks a lot! It is really great work!
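
For readers landing here, a minimal sketch of the two paths being compared, assuming a placeholder update rule: the real TimeStep in cell.py takes b, c, h1, h2, self.dt, and self.geom.h, and its forward/backward bodies are not reproduced here, so the toy update and hand-derived gradients below are illustrative only.

```python
import torch

# Hypothetical plain implementation; stands in for the repo's _time_step.
def _time_step(h1, h2, dt):
    # Toy explicit time-step update (placeholder, not the repo's physics).
    return 2.0 * h2 - h1 + dt * dt * h2

class TimeStep(torch.autograd.Function):
    # Custom Function: forward mirrors _time_step, but backward is written
    # by hand instead of letting autograd record every intermediate op.
    @staticmethod
    def forward(ctx, h1, h2, dt):
        ctx.dt = dt
        return 2.0 * h2 - h1 + dt * dt * h2

    @staticmethod
    def backward(ctx, grad_y):
        # Hand-derived from the forward expression:
        # dy/dh1 = -1, dy/dh2 = 2 + dt^2, and no gradient for dt.
        return -grad_y, (2.0 + ctx.dt * ctx.dt) * grad_y, None

h1 = torch.randn(4, dtype=torch.double, requires_grad=True)
h2 = torch.randn(4, dtype=torch.double, requires_grad=True)

# Both paths give the same forward values and, for this toy update,
# matching gradients, consistent with the observation above.
assert torch.allclose(TimeStep.apply(h1, h2, 0.1), _time_step(h1, h2, 0.1))
torch.autograd.gradcheck(lambda a, b: TimeStep.apply(a, b, 0.1), (h1, h2))
```

A common motivation for this pattern, even when both paths yield the same gradients, is efficiency: a hand-written backward can avoid storing every intermediate tensor that autograd would otherwise keep alive across many time steps, which can matter for long simulations.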
