Add hooks for execution on intel gaudi devices - 1 #128584
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/128584
Note: Links to docs will display an error until the docs builds have been completed.
✅ You can merge normally! (2 Unrelated Failures)
As of commit c397cd1 with merge base 91a8376:
FLAKY - The following jobs failed but were likely due to flakiness present on trunk.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
@albanD : Kindly help with the review and merge
@pytorchbot rebase
@pytorchbot started a rebase job onto refs/remotes/origin/viable/strict. Check the current status here.
Successfully rebased
Force-pushed b4d5e0b to b9b7132
Force-pushed b9b7132 to 673a20a
@albanD : could you kindly help with the review and merge. Thank you.
@albanD : gentle reminder, could you please help with this PR. Thanks.
Force-pushed 4ce9bd0 to c397cd1
@albanD : can you please help with the review
This is most likely to be brittle given that you don't have CI signal for it.
tbh if your CI cannot live in PT, I think a good long term plan is for the hpu CI spec to live where the CI runs. There you will be able to pin PyTorch and move forward carefully while always having a known good version.
But I guess that now that you have this scaffolding, you can already update these attributes on the OpInfos directly from your own repo before running the tests?
@albanD : thanks for the approval. Yes, that's the goal: once this change is in, we will update the OpInfo data with Gaudi op capabilities from our own repo. That way we ensure it stays clean.
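The idea discussed above — patching OpInfo dtype data from the accelerator's own repo before the tests run — could look roughly like the following sketch. All names here (`SimpleOpInfo`, `HPU_UNSUPPORTED_DTYPES`, `apply_device_dtype_overrides`) are illustrative stand-ins, not actual PyTorch identifiers:

```python
class SimpleOpInfo:
    """Minimal stand-in for the real OpInfo entries in PyTorch's op database."""
    def __init__(self, name, dtypes):
        self.name = name
        self.dtypes = set(dtypes)

# Toy op database, as the test framework would hold it.
op_db = [
    SimpleOpInfo("add", {"float32", "float64", "int64"}),
    SimpleOpInfo("erf", {"float32", "float64"}),
]

# Device capability data maintained out-of-tree, in the accelerator's repo.
HPU_UNSUPPORTED_DTYPES = {"erf": {"float64"}}

def apply_device_dtype_overrides(ops, unsupported):
    """Strip dtypes the device cannot run, before tests are generated."""
    for op in ops:
        op.dtypes -= unsupported.get(op.name, set())

apply_device_dtype_overrides(op_db, HPU_UNSUPPORTED_DTYPES)
```

Because the override runs before test instantiation, the suite simply never generates the unsupported dtype variants for that device; the in-tree OpInfo data stays untouched.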
@pytorchbot merge |
Merge failed
Reason: This PR needs a `release notes:` label. If your changes are user facing and intended to be a part of release notes, please use a label starting with `release notes:`. If not, please add the `topic: not user facing` label.
To add a label, you can comment to pytorchbot, for example `@pytorchbot label "topic: not user facing"`.
For more information, see the PyTorch Bot wiki.
Details for Dev Infra team: Raised by workflow job
@pytorchbot label "topic: not user facing" |
@pytorchbot merge |
Merge started
Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
The merge job was canceled or timed out. This most often happens if two merge requests were issued for the same PR, or if the merge job was waiting for more than 6 hours for tests to finish. In the latter case, please do not hesitate to reissue the merge command.
@pytorchbot merge |
Merge started
Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
## Motivation

This is a follow-up to PR pytorch#126970 to support Gaudi devices for PyTorch UT execution.

## Changes

We are adding additional hooks to:
1. Add dtype exceptions for Gaudi/HPU
2. Extend `onlyNativeDevices` decorator functionality to add additional devices

Pull Request resolved: pytorch#128584
Approved by: https://github.com/albanD
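The second change listed in the PR description — extending an `onlyNativeDevices`-style decorator to accept additional devices — can be sketched roughly as below. This is a simplified illustration, not the actual PyTorch implementation; `NATIVE_DEVICE_TYPES` and `EXTRA_ALLOWED_DEVICE_TYPES` are hypothetical names standing in for the real hook:

```python
import functools
import unittest

NATIVE_DEVICE_TYPES = {"cpu", "cuda"}
# The new hook would let an out-of-tree backend register itself here.
EXTRA_ALLOWED_DEVICE_TYPES = {"hpu"}

def only_native_device_types(fn):
    """Skip the test unless the device is native or explicitly allow-listed."""
    @functools.wraps(fn)
    def wrapper(self, device, *args, **kwargs):
        if device not in NATIVE_DEVICE_TYPES | EXTRA_ALLOWED_DEVICE_TYPES:
            raise unittest.SkipTest(f"only native device types; skipping on {device}")
        return fn(self, device, *args, **kwargs)
    return wrapper

class DemoTests:
    @only_native_device_types
    def test_add(self, device):
        return f"ran on {device}"
```

With the allow-list extended, a Gaudi run passes the decorator's check on `hpu` while tests on unregistered device types are still skipped.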