
[aotinductor] Only autotune at compile time when enabled via config #129335

Closed · wants to merge 1 commit

Conversation

@ColinPeppler (Contributor) commented on Jun 23, 2024

@pytorch-bot bot commented on Jun 23, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/129335

Note: Links to docs will display an error until the docs builds have been completed.

❌ 16 New Failures, 1 Cancelled Job, 34 Unrelated Failures

As of commit defde8d with merge base 1a54bb0:

NEW FAILURES - The following jobs have failed:

CANCELLED JOB - The following job was cancelled. Please retry:

FLAKY - The following job failed but was likely due to flakiness present on trunk:

UNSTABLE - The following jobs failed but were likely due to flakiness present on trunk and have been marked as unstable:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot (Contributor):

This pull request was exported from Phabricator. Differential Revision: D58931057

@@ -1488,7 +1488,7 @@ def wrap_arg(arg):
     elif isinstance(arg, (int, float, bool, SymbolicCallArg)):
         return str(arg)
     else:
-        return pexpr(V.graph.sizevars.simplify(arg))
+        return self.expr_printer(V.graph.sizevars.simplify(arg))
Contributor:

Will this fix the internal model failure you pointed out in D58931057?
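The hunk above swaps the module-level Python printer pexpr for the instance attribute self.expr_printer, so a wrapper subclass that emits C++ renders sizevar expressions in its own language instead of Python syntax. Below is a minimal sketch of that dispatch pattern; the class bodies and the cexpr output format are simplified stand-ins for illustration, not the real Inductor implementations:

import sympy

def pexpr(expr: sympy.Expr) -> str:
    # Python-syntax printing (what the old code always used).
    return str(expr)

def cexpr(expr: sympy.Expr) -> str:
    # Hypothetical C++-syntax printing used by the cpp wrapper.
    return f"static_cast<int64_t>({expr})"

class WrapperCodeGen:
    expr_printer = staticmethod(pexpr)

    def wrap_arg(self, arg: sympy.Expr) -> str:
        # Going through self.expr_printer lets subclasses pick the language.
        return self.expr_printer(sympy.simplify(arg))

class CppWrapperCodeGen(WrapperCodeGen):
    expr_printer = staticmethod(cexpr)

s0 = sympy.Symbol("s0")
print(WrapperCodeGen().wrap_arg(2 * s0))     # 2*s0
print(CppWrapperCodeGen().wrap_arg(2 * s0))  # static_cast<int64_t>(2*s0)

With the old module-level pexpr call, the second case would also print Python syntax into generated C++ code.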

@@ -1300,7 +1300,7 @@ def compile_fx(
     with config.patch(
         {
             "cpp_wrapper": False,
-            "triton.autotune_at_compile_time": True,
+            "triton.autotune_at_compile_time": config.triton.autotune_at_compile_time,
Contributor:

Can you also update https://github.com/pytorch/pytorch/blob/main/torch/_inductor/config.py#L700 to make the default value 1? I wanted to keep this always on unless it breaks something beyond the current test coverage. I'm not sure whether your change to wrapper.py fixes the underlying problem. If it does, you can make the default value of TORCHINDUCTOR_TRITON_AUTOTUNE_AT_COMPILE_TIME always 1; otherwise, you can make the default of TORCHINDUCTOR_TRITON_AUTOTUNE_AT_COMPILE_TIME differ between fbcode and OSS.

@ColinPeppler (Contributor, Author):

Making the fbcode and OSS defaults different, since the issue still exists.

Contributor:

Sorry, I was wrong. Changing _inductor/config.py would change the behavior of JIT Inductor as well. We can only change it here.
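The thread above converges on leaving the global default in _inductor/config.py untouched (so JIT Inductor is unaffected) and resolving the fbcode-vs-OSS default only at the AOT compile call site. A hedged sketch of that idea; resolve_autotune_default and is_fbcode here are illustrative stand-ins, not the PR's actual code:

from typing import Optional

def is_fbcode() -> bool:
    # Stand-in; the real check inspects the torch build environment.
    return False

def resolve_autotune_default(configured: Optional[bool]) -> bool:
    # Honor an explicit user setting; otherwise default to "on" inside
    # fbcode and "off" in OSS.
    if configured is not None:
        return configured
    return is_fbcode()

print(resolve_autotune_default(None))  # False in OSS, True in fbcode
print(resolve_autotune_default(True))  # True: explicit opt-in wins

The resolved value would then be passed through config.patch at the AOT entry point, as in the hunk above.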

@pytorch-bot bot added the ciflow/trunk label (trigger trunk jobs on your pull request) on Jun 24, 2024
@ColinPeppler added the topic: not user facing label on Jun 24, 2024

ColinPeppler added a commit to ColinPeppler/pytorch that referenced this pull request on Jun 24, 2024:

[aotinductor] Only autotune at compile time when enabled via config (pytorch#129335)

Summary:
Pull Request resolved: pytorch#129335

Test Plan: ci

Reviewed By: jingsh, desertfire

Differential Revision: D58931057
@facebook-github-bot (Contributor):

This pull request was exported from Phabricator. Differential Revision: D58931057

Updated commit message:

[aotinductor] Only autotune at compile time when enabled via config (pytorch#129335)

Summary:
Pull Request resolved: pytorch#129335

Test Plan: ci

Reviewed By: jingsh, desertfire, houseroad

Differential Revision: D58931057

@facebook-github-bot (Contributor):

@pytorchbot merge

(Initiating merge automatically since Phabricator Diff has merged)

@pytorchmergebot (Collaborator):

Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions or feedback? Please reach out to the PyTorch DevX Team.
