torch.compile incorrect when imperative autograd APIs are used #91468
Labels
module: aotdispatch
umbrella label for AOTAutograd issues
module: autograd
Related to torch.autograd, and the autograd engine in general
module: correctness (silent)
issue that returns an incorrect result silently
module: pt2-dispatcher
PT2 dispatcher-related issues (e.g., aotdispatch, functionalization, faketensor, custom-op, …)
oncall: pt2
triaged
This issue has been looked at by a team member, and triaged and prioritized into an appropriate module
Issue description
torch.compile may be silently incorrect when Tensor.retain_grad or Tensor.register_hook is used on a tensor inside the compiled function: the imperative autograd call can be dropped without any error.
Code examples
Example with retain_grad:
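The original snippet was not preserved here; the following is a hypothetical minimal sketch of the pattern described (function name, shapes, and the `aot_eager` backend choice are illustrative). In eager mode, `retain_grad()` on an intermediate makes `y.grad` available after `backward()`; the report is that under `torch.compile` the call may be silently ignored, so `y.grad` can come back `None`.

```python
import torch

def fn(x):
    y = x.sin()
    y.retain_grad()  # imperative autograd API called on an intermediate
    return y.cos(), y

# Eager reference: y.grad is populated after backward.
x = torch.randn(3, requires_grad=True)
out, y = fn(x)
out.sum().backward()
print("eager y.grad:", y.grad)  # -sin(y), not None

# Compiled: per the report, retain_grad may be silently dropped.
x2 = x.detach().clone().requires_grad_(True)
try:
    out_c, y_c = torch.compile(fn, backend="aot_eager")(x2)
    out_c.sum().backward()
    print("compiled y.grad:", y_c.grad)  # may be None on affected versions
except Exception as e:  # other versions may graph-break or raise instead
    print("compiled run failed:", e)
```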
Example with register_hook:
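Again the original code is missing, so this is a hypothetical sketch of the `register_hook` variant (the hook and backend are illustrative, not the issue's exact repro). In eager mode the hook scales the gradient flowing back through `y`; the report is that under `torch.compile` the hook may be silently skipped, yielding the unscaled gradient.

```python
import torch

def fn(x):
    y = x * 2.0
    # Imperative hook: triple the gradient flowing back through y.
    y.register_hook(lambda g: g * 3.0)
    return y.sum()

# Eager reference: d(sum(2x))/dx = 2, and the hook scales it to 6.
x = torch.ones(3, requires_grad=True)
fn(x).backward()
print("eager x.grad:", x.grad)  # tensor([6., 6., 6.])

# Compiled: per the report, the hook may be silently dropped,
# which would leave x2.grad at the unscaled value of 2.
x2 = torch.ones(3, requires_grad=True)
try:
    torch.compile(fn, backend="aot_eager")(x2).backward()
    print("compiled x.grad:", x2.grad)
except Exception as e:  # other versions may graph-break or raise instead
    print("compiled run failed:", e)
```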
cc @ezyang @gchanan @kadeng @albanD @gqchen @pearu @nikitaved @soulitzer @lezcano @Varal7 @msaroufim @bdhirsh @anijain2305 @chauhang @wconstab @soumith @ngimel