Why does the LSTM get split out separately after converting ONNX to tmfile? #1171
Comments
The LSTM only supports unidirectional mode, so the output is missing the num_directions dimension, which breaks inference in the rest of the model.
Hello, have you solved this? On my side a unidirectional LSTM also gets split out separately.
Why does the LSTM get split out separately when converting ONNX to tmfile?
When the CRNN model reaches the fully connected layer during prerun, the inferred input tensor is 1,25,256, but the model's normal input should be 1,25,512. Why does this happen?
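The 256-vs-512 discrepancy is consistent with the unidirectional-only LSTM mentioned above. A minimal plain-Python sketch of the shape arithmetic, assuming the CRNN uses a bidirectional LSTM with hidden_size 256 (the helper name here is illustrative, not from Tengine or ONNX):

```python
# Per the ONNX spec, LSTM output Y has shape
# [seq_length, num_directions, batch_size, hidden_size].
seq_len, batch, hidden = 25, 1, 256

def fc_input_shape(num_directions):
    # Hypothetical helper: downstream layers see the directions
    # concatenated on the feature axis, so the fully connected layer
    # receives hidden * num_directions features per time step.
    return (batch, seq_len, hidden * num_directions)

print(fc_input_shape(2))  # bidirectional: (1, 25, 512), the expected input
print(fc_input_shape(1))  # forward-only after conversion: (1, 25, 256)
```

If the converter drops the backward direction (num_directions falls from 2 to 1), the feature axis shrinks from 512 to 256, which matches the tensor seen at prerun.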