
How to register an onnx-graphsurgeon (not PyTorch) custom operator to ONNX? #241

Open
dedoogong opened this issue Apr 2, 2021 · 0 comments


dedoogong commented Apr 2, 2021

Ask a Question

Question

I have converted a TF model to ONNX with the tool below; the model contains some ops that are unsupported in ONNX:

python3 -m tf2onnx.convert --graphdef frozen_model_v2 --output aslfeatv2.onnx

Then, since I need to run it on TensorRT, I first needed to check that the converted ONNX file is correct.
So I implemented those unsupported ops in C++/CUDA as onnxruntime custom ops, then built and tested them.
But ONNX itself still fails to recognize the custom onnxruntime ops when I run onnx-simplifier:

....
....
onnx.checker.check_model(model)
  File "/home/lee/.local/lib/python3.6/site-packages/onnx/checker.py", line 102, in check_model
    C.check_model(protobuf_string)
onnx.onnx_cpp2py_export.checker.ValidationError: No Op registered for ASLFeatPluginX with domain_version of 13

I've read all the tutorials related to custom ops, including the PyTorch custom-op ones, but nothing has helped me yet.

Can anybody give me a hint on how to get past this?

Thank you


