
interp op attributes error when converting onnx resize op. #1233

Open
un-knight opened this issue Dec 14, 2021 · 1 comment
@un-knight
Contributor
Bug description

After converting an onnx resize op into a tengine interp op, the attribute values of the tengine interp op are incorrect.

The onnx model to be converted:
[screenshot of the onnx model]

The converted tengine interp op attributes:
[screenshot of the interp op attributes]

Both output_width and width_scale of the converted tengine interp op are wrong: the expected output_width is 20 and the expected width_scale is 2 (see the sanity check sketched below).
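A minimal sketch of that sanity check, not part of the original report: it assumes the ./toy.onnx file produced by the export script under "Related code" below and that onnxruntime is installed.

# A 10x10 input upsampled with scale_factor=2 should come out as 20x20,
# i.e. output_width = 20 and width_scale = 2.
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession('./toy.onnx', providers=['CPUExecutionProvider'])
x = np.random.randn(1, 3, 10, 10).astype(np.float32)
(y,) = sess.run(None, {'input': x})
print(y.shape)  # expected: (1, 3, 20, 20)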

Environment

python == 3.8.12
pytorch == 1.10.0
tengine: https://github.com/OAID/Tengine/commit/91db3706e772568e022fcfcbef66d1998251988f

Related code

Code used to export the resize model:

import torch
import torch.nn as nn

class MyModule(torch.nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.resize = nn.Upsample(scale_factor=2, mode='nearest')

    def forward(self, x):
        return self.resize(x)

model = MyModule()
# opset 10 exports nn.Upsample as a Resize node whose scales are passed
# as a second input tensor rather than as node attributes
torch.onnx.export(
    model,
    torch.randn((1, 3, 10, 10)),
    './toy.onnx',
    input_names=['input'],
    output_names=['output'],
    opset_version=10,
)
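To double-check what the exporter actually wrote, the scales input of the Resize node can be read back directly from the onnx file. This is a minimal sketch using the onnx Python package; the find_scales helper is illustrative and not part of the original report.

import onnx
from onnx import numpy_helper

def find_scales(model, name):
    # The scales tensor may be stored as an initializer or produced by a
    # Constant node, depending on the exporter version.
    for init in model.graph.initializer:
        if init.name == name:
            return numpy_helper.to_array(init)
    for node in model.graph.node:
        if node.op_type == 'Constant' and node.output[0] == name:
            for attr in node.attribute:
                if attr.name == 'value':
                    return numpy_helper.to_array(attr.t)
    return None

model = onnx.load('./toy.onnx')
for node in model.graph.node:
    if node.op_type == 'Resize':
        # opset 10: input(0) is the data tensor, input(1) is the scales tensor
        print(find_scales(model, node.input[1]))  # expected: [1. 1. 2. 2.]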

Analysis

Debugging shows that during onnx2tengine, the load_resize step called from load_graph_node writes height_scale and width_scale correctly:

else if (onnx_node.input_size() == 2) // opset 10
{
    const std::string& input_name = onnx_node.input(1);
    ir_tensor_t* tensor = find_tensor(graph, input_name);
    float* data = (float*)tensor->data;
    interp_param->height_scale = data[2];
    interp_param->width_scale = data[3];
}

However, when op.param_mem is read again in the interp infer_shape stage of the subsequent optimize_graph pass, the width_scale value in it is already wrong.

The suspicion is therefore that some operation between load_graph_node and optimize_graph corrupts the width_scale value in the interp op's param_mem.

@BUG1989 BUG1989 added the bug label Jan 5, 2022
@Gitkingly

Gitkingly commented Sep 28, 2022

This is not a tengine bug; it is a netron bug, lutzroeder/netron#973, which has already been fixed.
