enable fx graph cache on torchbench (#128239)
Summary:
We've already enabled the FX graph cache for the timm and huggingface suites, but we previously saw failures saving cache entries for moco. It looks like pytorch/pytorch#128052 has fixed that issue, so we can now enable the cache for torchbench as well.

X-link: pytorch/pytorch#128239
Approved by: https://github.com/oulgen

Reviewed By: clee2000

Differential Revision: D58394057

Pulled By: masnesral

fbshipit-source-id: 58a226babdcd04a3bb14118e7a555558c56bc9ef
masnesral authored and facebook-github-bot committed Jun 11, 2024
1 parent 417731f commit 10801fb
Showing 1 changed file with 4 additions and 0 deletions.
4 changes: 4 additions & 0 deletions userbenchmark/dynamo/dynamobench/torchbench.py
@@ -25,6 +25,10 @@
 # We are primarily interested in tf32 datatype
 torch.backends.cuda.matmul.allow_tf32 = True
 
+# Enable FX graph caching
+if "TORCHINDUCTOR_FX_GRAPH_CACHE" not in os.environ:
+    torch._inductor.config.fx_graph_cache = True
+
 
 def _reassign_parameters(model):
     # torch_geometric models register parameter as tensors due to
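
For reference, a minimal standalone sketch of the toggle added above and how it interacts with the environment variable (assumptions: an explicit TORCHINDUCTOR_FX_GRAPH_CACHE setting takes precedence over the in-tree config default, per the usual Inductor env-var convention, and the final print is illustrative only, not part of the commit):

import os

import torch._inductor.config as inductor_config

# Mirror the benchmark-harness change: default the FX graph cache to on,
# but defer to an explicit environment override such as
# TORCHINDUCTOR_FX_GRAPH_CACHE=0.
if "TORCHINDUCTOR_FX_GRAPH_CACHE" not in os.environ:
    inductor_config.fx_graph_cache = True

print(f"fx_graph_cache enabled: {inductor_config.fx_graph_cache}")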
