Silently disable default saved tensor hooks during tracing (#123196)
Summary:
FIXES #113263. Same idea as in pytorch/pytorch#113417, but we need a more intrusive C API to silently no-op default saved tensor hooks, in order to support user code that uses torch.autograd.disable_saved_tensors_hooks (see test_unpack_hooks_can_be_disabled). We mock the output of get_hooks while leaving push/pop untouched.

For compiled autograd, we're currently firing pack hooks once and unpack hooks twice; I'll look into that separately from this issue.
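The mocking idea described above can be sketched without torch internals: a hook stack whose getter pretends no hooks are set while a tracing flag is enabled, with push/pop untouched. This is a minimal illustration of the approach, not PyTorch's actual C-level implementation; all names here are made up.

```python
# Illustrative stand-in for the C-level saved-tensors-hooks state.
# While "tracing" is enabled, get_hooks() silently reports no hooks,
# but push/pop still maintain the real stack underneath.
class SavedTensorHooksStack:
    def __init__(self):
        self._stack = []
        self._tracing = False

    def set_tracing(self, enabled):
        # Return the prior value so the caller can restore it later,
        # mirroring the setter added in this commit.
        prior, self._tracing = self._tracing, enabled
        return prior

    def push(self, pack_hook, unpack_hook):
        # Untouched by tracing: user code can still push/pop freely.
        self._stack.append((pack_hook, unpack_hook))

    def pop(self):
        self._stack.pop()

    def get_hooks(self):
        if self._tracing:
            return None  # silently no-op default hooks during tracing
        return self._stack[-1] if self._stack else None
```

While the flag is set, consumers of `get_hooks` behave as if no default hooks were installed, yet the stack state stays consistent once tracing ends.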

X-link: pytorch/pytorch#123196
Approved by: https://github.com/soulitzer

Reviewed By: izaitsevfb

Differential Revision: D58628125

Pulled By: xmfan

fbshipit-source-id: 399cba07d50610ba52da55951f7b2200eb9c217f
xmfan authored and facebook-github-bot committed Jun 16, 2024
1 parent fc298af commit 48223b8
Showing 1 changed file with 10 additions and 0 deletions.
10 changes: 10 additions & 0 deletions userbenchmark/dynamo/dynamobench/_dynamo/utils.py
@@ -2773,3 +2773,13 @@ def strip_color_from_string(text):
# This regular expression matches ANSI escape codes
ansi_escape = re.compile(r"\x1B[@-_][0-?]*[ -/]*[@-~]")
return ansi_escape.sub("", text)


@contextlib.contextmanager
def _disable_saved_tensors_hooks_during_tracing():
# See NOTE: [Deferring tensor pack/unpack hooks until runtime]
try:
prior = torch._C._autograd._saved_tensors_hooks_set_tracing(True)
yield
finally:
torch._C._autograd._saved_tensors_hooks_set_tracing(prior)
