Broadcasting is broken with LayoutTransform #4508
Comments
The LayoutTransform operator is not intended to handle such a transform; I think we need to enhance it.
This is not just an expand_dims problem. The input shape is (1,), and to take it from C to NCHW8c we would first need an expand_dims and then a layout transform. Instead, a simpler way is to prevent the insertion of expand_dims and layout transforms altogether for scalars. Scalars can easily be broadcast even if the second tensor's layout is transformed. The above PR does just that.
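As an aside illustrating that last point (a sketch, not part of the PR): broadcasting aligns shapes from the right, so a (1,)-shaped tensor broadcasts against a tensor of any rank, including the 5-D NCHW8c form, with no reshaping of the scalar operand. The shapes below are hypothetical:

```python
import numpy as np

# A (1,)-shaped "scalar" input, as in the issue.
scalar = np.array([2.0], dtype="float32")

# The same logical data in two layouts: plain NCHW and blocked NCHW8c.
nchw = np.zeros((1, 16, 4, 4), dtype="float32")      # N, C, H, W
nchw8c = np.zeros((1, 2, 4, 4, 8), dtype="float32")  # N, C//8, H, W, 8

# Broadcasting aligns trailing dimensions, so the (1,) tensor
# broadcasts cleanly against either layout; no expand_dims or
# layout_transform is needed for the scalar operand.
print((nchw + scalar).shape)    # (1, 16, 4, 4)
print((nchw8c + scalar).shape)  # (1, 2, 4, 4, 8)
```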
@pyalex This is fixed. Can you please check and close the issue? Thanks!
Problem solved! Thanks
When using optimization level >= 3, LayoutTransform fails on converting C -> NCHW8c when the input shape is (1,), and that breaks the next operation, which is supposed to broadcast this input.
Code to reproduce; it works with opt_level < 3.
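The reporter's script is not preserved in this transcript. Below is a minimal sketch of the pattern the issue describes, assuming a standard Relay workflow and the current `tvm.transform.PassContext` API: a conv2d whose layout gets rewritten to NCHW8c by AlterOpLayout at opt_level=3 on an x86 target, followed by an add with a (1,)-shaped constant. All shapes and the target string are illustrative.

```python
import numpy as np
import tvm
from tvm import relay

# Hypothetical conv2d whose output layout AlterOpLayout rewrites
# to NCHW8c on an x86 target at opt_level=3.
data = relay.var("data", shape=(1, 8, 16, 16), dtype="float32")
weight = relay.var("weight", shape=(8, 8, 3, 3), dtype="float32")
conv = relay.nn.conv2d(data, weight, kernel_size=(3, 3), padding=(1, 1))

# The (1,)-shaped input that the next op is supposed to broadcast.
scalar = relay.const(np.ones((1,), dtype="float32"))
out = relay.add(conv, scalar)

mod = tvm.IRModule.from_expr(relay.Function([data, weight], out))

# opt_level=3 enables AlterOpLayout; opt_level < 3 avoided the failure.
with tvm.transform.PassContext(opt_level=3):
    lib = relay.build(mod, target="llvm -mcpu=core-avx2")
```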
Compiled program with the issue
Full Stack Trace