
[Frontend][Paddle] [PaddlePaddle Hackathon 4] Add attribute support for dropout/hard_sigmoid/pixel_shuffle #14575

Merged: 10 commits, merged Apr 12, 2023

Conversation

MayYouBeProsperous (Contributor) commented Apr 11, 2023

- Add `dropout_prob` and `dropout_implementation` attributes for the `dropout` op.
- Add the `offset` attribute for the `hard_sigmoid` op.
- Add the `data_format` attribute for the `pixel_shuffle` op.
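For context on the `hard_sigmoid` change: Paddle's `hard_sigmoid` computes `clip(slope * x + offset, 0, 1)`, so ignoring `offset` silently assumes the default of 0.5. The sketch below is a hypothetical standalone helper (not the TVM converter code) illustrating the math the attribute controls; the default `slope=0.2`, `offset=0.5` values follow Paddle's documentation.

```python
def hard_sigmoid(x, slope=0.2, offset=0.5):
    """Sketch of Paddle's hard_sigmoid semantics:
    clip(slope * x + offset, 0, 1). The PR threads the real
    `offset` attribute through instead of assuming 0.5."""
    return min(max(slope * x + offset, 0.0), 1.0)

print(hard_sigmoid(0.0))    # with defaults, x=0 yields the offset: 0.5
print(hard_sigmoid(10.0))   # clipped to 1.0
print(hard_sigmoid(-10.0))  # clipped to 0.0
```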

tvm-bot (Collaborator) commented Apr 11, 2023

Thanks for contributing to TVM! Please refer to the contributing guidelines https://tvm.apache.org/docs/contribute/ for useful information and tips. Please request code reviews from Reviewers by @-ing them in a comment.

Generated by tvm-bot

MayYouBeProsperous changed the title from "[Frontend][Paddle] [PaddlePaddle Hackathon 4]add attribute support for some ops" to "[Frontend][Paddle] [PaddlePaddle Hackathon 4]add attribute support for dropout/gelu/hard_sigmoid/pixel_shuffle" on Apr 11, 2023
```diff
-g.add_node(op.output("Out")[0], x)
+dropout_prob = op.attr("dropout_prob")
+out = _op.nn.dropout(x, dropout_prob)
+g.add_node(op.output("Out")[0], out)
```
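The other new attribute, `dropout_implementation`, changes what dropout means at inference time. Per Paddle's documentation, `"downscale_in_infer"` (the default) scales the output by `1 - dropout_prob` during inference, while `"upscale_in_train"` rescales during training so inference is the identity. The function below is a hypothetical plain-Python sketch of that inference-time behavior, not TVM or Paddle code.

```python
def dropout_infer(x, dropout_prob, dropout_implementation="downscale_in_infer"):
    """Inference-time effect of Paddle dropout, sketched under the
    assumption of Paddle's documented two-mode semantics."""
    if dropout_implementation == "upscale_in_train":
        # Training already rescaled kept units by 1/(1-p); inference is identity.
        return x
    # Default "downscale_in_infer": scale the output down at inference.
    return x * (1.0 - dropout_prob)

print(dropout_infer(2.0, 0.5))                      # 1.0
print(dropout_infer(2.0, 0.5, "upscale_in_train"))  # 2.0
```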
jiangjiajun (Contributor) commented on this change, Apr 12, 2023.

MayYouBeProsperous (Author) replied:

Done.

```python
out = x * (
    _expr.const(0.5, dtype="float32")
    + _op.erf(x * _expr.const(0.5**0.5, dtype="float32")) * _expr.const(0.5, dtype="float32")
)
approximate = op.attr("approximate")
```
jiangjiajun (Contributor) commented:

There's no need to implement the approximate strategy; in Paddle it is only an optimization to speed up the computation.
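To illustrate why dropping the approximate path is safe numerically: the exact GELU uses `erf`, while the `approximate=True` mode uses the well-known tanh approximation, and the two agree closely. The sketch below uses plain Python `math` (not TVM's `_op`/`_expr` APIs) to compare them; `gelu_exact` mirrors the erf form in the diff above.

```python
import math

def gelu_exact(x):
    # GELU(x) = x * 0.5 * (1 + erf(x / sqrt(2))) -- the erf form kept by this PR
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh_approx(x):
    # tanh approximation (Hendrycks & Gimpel); Paddle's `approximate=True` path
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x**3)))

for v in (-2.0, -0.5, 0.0, 0.5, 2.0):
    print(f"{v:+.1f}: exact={gelu_exact(v):.6f} approx={gelu_tanh_approx(v):.6f}")
```

For typical activation magnitudes the two differ by well under 1e-3, which is why the approximation is purely a speed trade-off.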

MayYouBeProsperous (Author) replied:

Done.

MayYouBeProsperous changed the title from "[Frontend][Paddle] [PaddlePaddle Hackathon 4]add attribute support for dropout/gelu/hard_sigmoid/pixel_shuffle" to "[Frontend][Paddle] [PaddlePaddle Hackathon 4]add attribute support for dropout/hard_sigmoid/pixel_shuffle" on Apr 12, 2023
jiangjiajun (Contributor) reviewed and left a comment:

LGTM @junrushao

masahi merged commit 606e2b7 into apache:main on Apr 12, 2023
5 participants