This repository was archived by the owner on Nov 17, 2023. It is now read-only.

Fix reverse shape inference in LayerNorm #17683

Merged
leezu merged 2 commits into apache:master from sxjscience:layer_norm_fix on Feb 25, 2020

Conversation

sxjscience
Member

Fixes #17654. Now the user will see an error message if in_channels does not match the corresponding dimension of the input.

After this PR, the following code will raise an error:

import mxnet as mx
from mxnet.gluon import nn
net = nn.LayerNorm(in_channels=10)
net.initialize()
net.hybridize()
out = net(mx.nd.ones((2, 11)))  # Trigger the error

Error:

MXNetError: MXNetError: Error in operator layernorm0_layernorm0: Shape inconsistent, Provided = [10], inferred shape=[11]
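
For contrast, a minimal sketch of the matching case, assuming an MXNet build that includes this fix: with in_channels set, reverse shape inference succeeds when the last axis of the input agrees, and the gamma/beta parameters take shape (in_channels,).

import mxnet as mx
from mxnet.gluon import nn

net = nn.LayerNorm(in_channels=10)
net.initialize()
net.hybridize()

# Last axis of the input (10) agrees with in_channels, so shape
# inference succeeds and no error is raised.
out = net(mx.nd.ones((2, 10)))
print(out.shape)        # expected: (2, 10)
print(net.gamma.shape)  # expected: (10,)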

add test case for error checking
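
A hypothetical sketch of what such an error-checking test might look like (the test name and the use of pytest.raises are assumptions; the PR's actual test lives in the MXNet test suite and may differ):

import pytest
import mxnet as mx
from mxnet.base import MXNetError
from mxnet.gluon import nn

def test_layernorm_in_channels_mismatch():
    # in_channels (10) disagrees with the input's last axis (11),
    # so shape inference should fail loudly instead of passing silently.
    net = nn.LayerNorm(in_channels=10)
    net.initialize()
    net.hybridize()
    with pytest.raises(MXNetError):
        net(mx.nd.ones((2, 11)))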
@sxjscience
Member Author

@zheyuye

@leezu leezu merged commit ce8a616 into apache:master Feb 25, 2020
@sxjscience sxjscience deleted the layer_norm_fix branch February 25, 2020 21:41
anirudh2290 pushed a commit to anirudh2290/mxnet that referenced this pull request May 29, 2020
* Update layer_norm.cc

add test case for error checking

* fix indent
Development

Successfully merging this pull request may close these issues.

[LayerNorm] Missing the mismatch cues of in_channels