Unspec nn module when global backward hooks are present (#127802)
Summary:
X-link: pytorch/pytorch#127802
Approved by: https://github.com/jansel
ghstack dependencies: #127785

Reviewed By: atalman

Differential Revision: D58168157

Pulled By: anijain2305

fbshipit-source-id: ab9bf505439b5df39a2c45447427744db16028a6
anijain2305 authored and facebook-github-bot committed Jun 5, 2024
1 parent b5ab0fe commit 6a02e0c
Showing 1 changed file with 8 additions and 0 deletions: userbenchmark/dynamo/dynamobench/_dynamo/utils.py

@@ -2106,6 +2106,14 @@ def format_bytecode(prefix, name, filename, line_no, code):
all_hook_names = forward_hook_names + backward_hook_names + state_dict_hook_names


def nn_module_has_global_hooks():
# This is limited to backward hooks for now because NNModuleVariable
# supports fwd hooks underneath.
return len(torch.nn.modules.module._global_backward_hooks) or len(
torch.nn.modules.module._global_backward_pre_hooks
)


def nn_module_get_all_hooks(
mod,
check_forward_hooks=False,
