[Convert to onnx] #76
Comments
Hello, has your problem been solved?

Hi, any news?

Hey, did anyone succeed at resolving this issue?

Has anyone solved it? :)
I successfully converted the PyTorch model to an ONNX one.
No better method, just step-by-step debugging.

Could you please share some insights about this? That would be quite helpful.
```
self.Transformation = 'None'
```
…----- Original Message -----
From: "aisensiy" <[email protected]>
To: "clovaai/deep-text-recognition-benchmark" <[email protected]>
Cc: "Xuhua Ren" <[email protected]>, "Mention" <[email protected]>
Sent: Thursday, May 14, 2020, 4:22:48 PM
Subject: Re: [clovaai/deep-text-recognition-benchmark] [Convert to onnx] (#76)
@xuhuaren
```
self.workers = 1
self.batch_size = 192
self.onnx = os.path.join(main_path, "squeezenet1_1.onnx")
self.batch_max_length = 25
self.imgH = 32
self.imgW = 100
self.rgb = False
self.character = '0123456789abcdefghijklmnopqrstuvwxyz'
self.sensitive = False
self.PAD = False
self.Transformation = 'TPS'
self.FeatureExtraction = 'ResNet'
self.SequenceModeling = 'BiLSTM'
self.Prediction = 'Attn'
self.num_fiducial = 20
self.input_channel = 1
self.output_channel = 512
self.hidden_size = 256
self.num_gpu = 0
```
Could you please share some insights about this? That will be quite helpful.
Wow, thanks for your super fast reply! You mean use
```
sub_hidden = torch.unsqueeze(hidden[0], 1)
output_hiddens = torch.cat((output_hiddens[:, :i, :], sub_hidden), dim=1)
```
and disable the TPS module?
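The two lines above sidestep in-place slice assignment (e.g. `output_hiddens[:, i, :] = hidden[0]` in the original Attention loop), which ONNX tracing has historically handled poorly; growing the tensor with `torch.cat` is trace-friendly. A minimal sketch showing the two styles produce the same tensor (the shapes here are made up for illustration):

```python
import torch

batch, steps, hidden_size = 2, 5, 4
hidden = (torch.randn(batch, hidden_size),)  # mimic an LSTM hidden-state tuple

# In-place style: write into a preallocated buffer (awkward for ONNX tracing).
inplace = torch.zeros(batch, steps, hidden_size)
for i in range(steps):
    inplace[:, i, :] = hidden[0]

# Concat style, as in the comment above: rebuild the tensor step by step.
output_hiddens = torch.zeros(batch, 0, hidden_size)
for i in range(steps):
    sub_hidden = torch.unsqueeze(hidden[0], 1)  # (batch, 1, hidden_size)
    output_hiddens = torch.cat((output_hiddens[:, :i, :], sub_hidden), dim=1)

print(torch.equal(inplace, output_hiddens))  # → True
```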
…----- Original Message -----
From: "cowyyy" <[email protected]>
To: "clovaai/deep-text-recognition-benchmark" <[email protected]>
Cc: "Xuhua Ren" <[email protected]>, "Mention" <[email protected]>
Sent: Tuesday, May 19, 2020, 5:36:12 PM
Subject: Re: [clovaai/deep-text-recognition-benchmark] [Convert to onnx] (#76)
@xuhuaren @aisensiy
Hi, I'm also confronted with the ONNX export problem:
1. TPS uses F.grid_sample, which has no corresponding op in ONNX.
2. The 'Attn' prediction head contains a loop.
Could you please share some insights about this?
@xuhuaren So this cannot be ported with the TPS module?

@xuhuaren Could you share your code for exporting to ONNX?

Code would be very helpful; I have difficulties with this as well.

Yes, please share the code.

@xuhuaren Could you please tell me the version of your PyTorch and CUDA? When I tried with a config like yours, an error occurred.

I am trying to convert the model to ONNX or TensorRT. Has anyone tried and succeeded?
Has anyone tried to convert .pth to .onnx?
I tried, but the outputs were unchanged for different inputs.