
[llama] Dont fail on training with random weights #1631

Closed

Conversation

anijain2305 (Contributor) commented on May 10, 2023

Enables testing torch.compile with random weights.
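As a rough illustration of the kind of check this enables (a minimal sketch, not the PR's actual diff), the snippet below compiles a toy model with randomly initialized weights and runs a single training step through torch.compile. The model, shapes, and hyperparameters are placeholders and are not taken from the llama benchmark.

```python
import torch
import torch.nn as nn

# Hypothetical toy model standing in for a benchmark model whose weights
# are randomly initialized rather than loaded from a checkpoint.
model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 64))
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

# Compile the model; training should still work even though the weights
# are random (the scenario this PR stops rejecting).
compiled_model = torch.compile(model)

x = torch.randn(8, 64)
target = torch.randn(8, 64)

loss = nn.functional.mse_loss(compiled_model(x), target)
loss.backward()
optimizer.step()
optimizer.zero_grad()
print(f"training step completed, loss={loss.item():.4f}")
```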

xuzhao9 (Contributor) left a comment


LGTM, thanks for the fix!

facebook-github-bot commented

@anijain2305 has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

facebook-github-bot commented

@anijain2305 merged this pull request in ea0bdfe.

xuzhao9 pushed a commit that referenced this pull request on May 11, 2023
Summary:
Enables testing torch.compile with random weights.

Pull Request resolved: #1631

Reviewed By: xuzhao9

Differential Revision: D45718801

Pulled By: anijain2305

fbshipit-source-id: 44c7ebba7c292dd298b66009b0e7fd6deee6d08c
3 participants