
Why does LSTM get split out separately after converting ONNX to tmfile? #1171

Open
mahuixian opened this issue Sep 30, 2021 · 2 comments

Comments


mahuixian commented Sep 30, 2021

[screenshot]

When converting ONNX to tmfile, why does the LSTM node get split out separately?
When the CRNN model runs prerun, the input tensor inferred at the fully connected layer is 1,25,256, but the model's normal input should be 1,25,512. Why does this happen?

@mahuixian (Author)

The LSTM only supports unidirectional mode, so the output is missing the num_directions dimension, which causes problems in the downstream model inference.
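A minimal sketch of why the feature size halves, assuming the shapes follow the ONNX LSTM convention (output Y has shape [seq_length, num_directions, batch_size, hidden_size], and downstream layers concatenate the direction axis into the feature axis). The function names here are illustrative, not Tengine APIs:

```python
# Per the ONNX spec, the LSTM output Y has shape
# [seq_length, num_directions, batch_size, hidden_size].
def lstm_output_shape(seq_length, batch_size, hidden_size, bidirectional):
    num_directions = 2 if bidirectional else 1
    return (seq_length, num_directions, batch_size, hidden_size)

def merged_feature_size(shape):
    # Downstream layers typically merge the num_directions axis into
    # the feature axis, concatenating forward and backward states.
    _, num_directions, _, hidden_size = shape
    return num_directions * hidden_size

# Bidirectional LSTM with hidden_size=256 -> 2 * 256 = 512 features,
# matching the expected 1,25,512 input.
bi = lstm_output_shape(25, 1, 256, bidirectional=True)
print(merged_feature_size(bi))   # 512

# If the converter keeps only one direction, features drop to 256,
# matching the observed 1,25,256 tensor.
uni = lstm_output_shape(25, 1, 256, bidirectional=False)
print(merged_feature_size(uni))  # 256
```

This would explain the reported mismatch: a bidirectional LSTM with hidden_size 256 should feed 512 features to the fully connected layer, but a converter that drops the backward direction delivers only 256.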

@SuSung-boy

Hi, did you manage to solve this? On my side, a unidirectional LSTM also gets split out separately.
